Nov 26 01:45:26 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Nov 26 01:45:26 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 26 01:45:26 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 26 01:45:26 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 26 01:45:26 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 26 01:45:26 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 26 01:45:26 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Nov 26 01:45:26 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Nov 26 01:45:26 localhost kernel: signal: max sigframe size: 1776
Nov 26 01:45:26 localhost kernel: BIOS-provided physical RAM map:
Nov 26 01:45:26 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 26 01:45:26 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 26 01:45:26 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 26 01:45:26 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 26 01:45:26 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 26 01:45:26 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 26 01:45:26 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 26 01:45:26 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Nov 26 01:45:26 localhost kernel: NX (Execute Disable) protection: active
Nov 26 01:45:26 localhost kernel: SMBIOS 2.8 present.
Nov 26 01:45:26 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 26 01:45:26 localhost kernel: Hypervisor detected: KVM
Nov 26 01:45:26 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 26 01:45:26 localhost kernel: kvm-clock: using sched offset of 2984058625 cycles
Nov 26 01:45:26 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 26 01:45:26 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 26 01:45:26 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Nov 26 01:45:26 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Nov 26 01:45:26 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 26 01:45:26 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 26 01:45:26 localhost kernel: Using GB pages for direct mapping
Nov 26 01:45:26 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Nov 26 01:45:26 localhost kernel: ACPI: Early table checksum verification disabled
Nov 26 01:45:26 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 26 01:45:26 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 26 01:45:26 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 26 01:45:26 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 26 01:45:26 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 26 01:45:26 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 26 01:45:26 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 26 01:45:26 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 26 01:45:26 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 26 01:45:26 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 26 01:45:26 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 26 01:45:26 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 26 01:45:26 localhost kernel: No NUMA configuration found
Nov 26 01:45:26 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Nov 26 01:45:26 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Nov 26 01:45:26 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Nov 26 01:45:26 localhost kernel: Zone ranges:
Nov 26 01:45:26 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 26 01:45:26 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 26 01:45:26 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Nov 26 01:45:26 localhost kernel:   Device   empty
Nov 26 01:45:26 localhost kernel: Movable zone start for each node
Nov 26 01:45:26 localhost kernel: Early memory node ranges
Nov 26 01:45:26 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 26 01:45:26 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 26 01:45:26 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Nov 26 01:45:26 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Nov 26 01:45:26 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 26 01:45:26 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 26 01:45:26 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 26 01:45:26 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 26 01:45:26 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 26 01:45:26 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 26 01:45:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 26 01:45:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 26 01:45:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 26 01:45:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 26 01:45:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 26 01:45:26 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 26 01:45:26 localhost kernel: TSC deadline timer available
Nov 26 01:45:26 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 26 01:45:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 26 01:45:26 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 26 01:45:26 localhost kernel: Booting paravirtualized kernel on KVM
Nov 26 01:45:26 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 26 01:45:26 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 26 01:45:26 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Nov 26 01:45:26 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 26 01:45:26 localhost kernel: Fallback order for Node 0: 0
Nov 26 01:45:26 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Nov 26 01:45:26 localhost kernel: Policy zone: Normal
Nov 26 01:45:26 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 26 01:45:26 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Nov 26 01:45:26 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Nov 26 01:45:26 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 26 01:45:26 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 26 01:45:26 localhost kernel: software IO TLB: area num 8.
Nov 26 01:45:26 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Nov 26 01:45:26 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Nov 26 01:45:26 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 26 01:45:26 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Nov 26 01:45:26 localhost kernel: ftrace: allocated 176 pages with 3 groups
Nov 26 01:45:26 localhost kernel: Dynamic Preempt: voluntary
Nov 26 01:45:26 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 26 01:45:26 localhost kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 26 01:45:26 localhost kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 26 01:45:26 localhost kernel: 	Rude variant of Tasks RCU enabled.
Nov 26 01:45:26 localhost kernel: 	Tracing variant of Tasks RCU enabled.
Nov 26 01:45:26 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 26 01:45:26 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 26 01:45:26 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 26 01:45:26 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 26 01:45:26 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 26 01:45:26 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Nov 26 01:45:26 localhost kernel: Console: colour VGA+ 80x25
Nov 26 01:45:26 localhost kernel: printk: console [tty0] enabled
Nov 26 01:45:26 localhost kernel: printk: console [ttyS0] enabled
Nov 26 01:45:26 localhost kernel: ACPI: Core revision 20211217
Nov 26 01:45:26 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 26 01:45:26 localhost kernel: x2apic enabled
Nov 26 01:45:26 localhost kernel: Switched APIC routing to physical x2apic.
Nov 26 01:45:26 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 26 01:45:26 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 26 01:45:26 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 26 01:45:26 localhost kernel: LSM: Security Framework initializing
Nov 26 01:45:26 localhost kernel: Yama: becoming mindful.
Nov 26 01:45:26 localhost kernel: SELinux: Initializing.
Nov 26 01:45:26 localhost kernel: LSM support for eBPF active
Nov 26 01:45:26 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 26 01:45:26 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 26 01:45:26 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 26 01:45:26 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 26 01:45:26 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 26 01:45:26 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 26 01:45:26 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 26 01:45:26 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Nov 26 01:45:26 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Nov 26 01:45:26 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 26 01:45:26 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 26 01:45:26 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 26 01:45:26 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 26 01:45:26 localhost kernel: Freeing SMP alternatives memory: 36K
Nov 26 01:45:26 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 26 01:45:26 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Nov 26 01:45:26 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 26 01:45:26 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 26 01:45:26 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 26 01:45:26 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 26 01:45:26 localhost kernel: ... version:                0
Nov 26 01:45:26 localhost kernel: ... bit width:              48
Nov 26 01:45:26 localhost kernel: ... generic registers:      6
Nov 26 01:45:26 localhost kernel: ... value mask:             0000ffffffffffff
Nov 26 01:45:26 localhost kernel: ... max period:             00007fffffffffff
Nov 26 01:45:26 localhost kernel: ... fixed-purpose events:   0
Nov 26 01:45:26 localhost kernel: ... event mask:             000000000000003f
Nov 26 01:45:26 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 26 01:45:26 localhost kernel: rcu: 	Max phase no-delay instances is 400.
Nov 26 01:45:26 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 26 01:45:26 localhost kernel: x86: Booting SMP configuration:
Nov 26 01:45:26 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Nov 26 01:45:26 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 26 01:45:26 localhost kernel: smpboot: Max logical packages: 8
Nov 26 01:45:26 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 26 01:45:26 localhost kernel: node 0 deferred pages initialised in 26ms
Nov 26 01:45:26 localhost kernel: devtmpfs: initialized
Nov 26 01:45:26 localhost kernel: x86/mm: Memory block size: 128MB
Nov 26 01:45:26 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 26 01:45:26 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 26 01:45:26 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 26 01:45:26 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 26 01:45:26 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Nov 26 01:45:26 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 26 01:45:26 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 26 01:45:26 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 26 01:45:26 localhost kernel: audit: type=2000 audit(1764139524.872:1): state=initialized audit_enabled=0 res=1
Nov 26 01:45:26 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 26 01:45:26 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 26 01:45:26 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 26 01:45:26 localhost kernel: cpuidle: using governor menu
Nov 26 01:45:26 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Nov 26 01:45:26 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 26 01:45:26 localhost kernel: PCI: Using configuration type 1 for base access
Nov 26 01:45:26 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 26 01:45:26 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 26 01:45:26 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Nov 26 01:45:26 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Nov 26 01:45:26 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Nov 26 01:45:26 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 26 01:45:26 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 26 01:45:26 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 26 01:45:26 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 26 01:45:26 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 26 01:45:26 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Nov 26 01:45:26 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Nov 26 01:45:26 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Nov 26 01:45:26 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 26 01:45:26 localhost kernel: ACPI: Interpreter enabled
Nov 26 01:45:26 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 26 01:45:26 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 26 01:45:26 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 26 01:45:26 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 26 01:45:26 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 26 01:45:26 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 26 01:45:26 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [3] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [4] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [5] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [6] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [7] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [8] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [9] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [10] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [11] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [12] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [13] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [14] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [15] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [16] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [17] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [18] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [19] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [20] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [21] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [22] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [23] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [24] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [25] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [26] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [27] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [28] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [29] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [30] registered
Nov 26 01:45:26 localhost kernel: acpiphp: Slot [31] registered
Nov 26 01:45:26 localhost kernel: PCI host bridge to bus 0000:00
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 26 01:45:26 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 26 01:45:26 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Nov 26 01:45:26 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Nov 26 01:45:26 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Nov 26 01:45:26 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 26 01:45:26 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Nov 26 01:45:26 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Nov 26 01:45:26 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Nov 26 01:45:26 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Nov 26 01:45:26 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 26 01:45:26 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Nov 26 01:45:26 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Nov 26 01:45:26 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 26 01:45:26 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Nov 26 01:45:26 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Nov 26 01:45:26 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 26 01:45:26 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 26 01:45:26 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 26 01:45:26 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 26 01:45:26 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 26 01:45:26 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 26 01:45:26 localhost kernel: iommu: Default domain type: Translated
Nov 26 01:45:26 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 26 01:45:26 localhost kernel: SCSI subsystem initialized
Nov 26 01:45:26 localhost kernel: ACPI: bus type USB registered
Nov 26 01:45:26 localhost kernel: usbcore: registered new interface driver usbfs
Nov 26 01:45:26 localhost kernel: usbcore: registered new interface driver hub
Nov 26 01:45:26 localhost kernel: usbcore: registered new device driver usb
Nov 26 01:45:26 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 26 01:45:26 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Nov 26 01:45:26 localhost kernel: PTP clock support registered
Nov 26 01:45:26 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 26 01:45:26 localhost kernel: NetLabel: Initializing
Nov 26 01:45:26 localhost kernel: NetLabel: domain hash size = 128
Nov 26 01:45:26 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Nov 26 01:45:26 localhost kernel: NetLabel: unlabeled traffic allowed by default
Nov 26 01:45:26 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 26 01:45:26 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 26 01:45:26 localhost kernel: vgaarb: loaded
Nov 26 01:45:26 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 26 01:45:26 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 26 01:45:26 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 26 01:45:26 localhost kernel: pnp: PnP ACPI init
Nov 26 01:45:26 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 26 01:45:26 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 26 01:45:26 localhost kernel: NET: Registered PF_INET protocol family
Nov 26 01:45:26 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Nov 26 01:45:26 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Nov 26 01:45:26 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 26 01:45:26 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 26 01:45:26 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 26 01:45:26 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Nov 26 01:45:26 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Nov 26 01:45:26 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 26 01:45:26 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 26 01:45:26 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 26 01:45:26 localhost kernel: NET: Registered PF_XDP protocol family
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 26 01:45:26 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 26 01:45:26 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 26 01:45:26 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 26 01:45:26 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 26874 usecs
Nov 26 01:45:26 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 26 01:45:26 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 26 01:45:26 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 26 01:45:26 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 26 01:45:26 localhost kernel: ACPI: bus type thunderbolt registered
Nov 26 01:45:26 localhost kernel: Initialise system trusted keyrings
Nov 26 01:45:26 localhost kernel: Key type blacklist registered
Nov 26 01:45:26 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Nov 26 01:45:26 localhost kernel: zbud: loaded
Nov 26 01:45:26 localhost kernel: integrity: Platform Keyring initialized
Nov 26 01:45:26 localhost kernel: NET: Registered PF_ALG protocol family
Nov 26 01:45:26 localhost kernel: xor: automatically using best checksumming function   avx
Nov 26 01:45:26 localhost kernel: Key type asymmetric registered
Nov 26 01:45:26 localhost kernel: Asymmetric key parser 'x509' registered
Nov 26 01:45:26 localhost kernel: Running certificate verification selftests
Nov 26 01:45:26 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 26 01:45:26 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 26 01:45:26 localhost kernel: io scheduler mq-deadline registered
Nov 26 01:45:26 localhost kernel: io scheduler kyber registered
Nov 26 01:45:26 localhost kernel: io scheduler bfq registered
Nov 26 01:45:26 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 26 01:45:26 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 26 01:45:26 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 26 01:45:26 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 26 01:45:26 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 26 01:45:26 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 26 01:45:26 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 26 01:45:26 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 26 01:45:26 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 26 01:45:26 localhost kernel: Non-volatile memory driver v1.3
Nov 26 01:45:26 localhost kernel: rdac: device handler registered
Nov 26 01:45:26 localhost kernel: hp_sw: device handler registered
Nov 26 01:45:26 localhost kernel: emc: device handler registered
Nov 26 01:45:26 localhost kernel: alua: device handler registered
Nov 26 01:45:26 localhost kernel: libphy: Fixed MDIO Bus: probed
Nov 26 01:45:26 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Nov 26 01:45:26 localhost kernel: ehci-pci: EHCI PCI platform driver
Nov 26 01:45:26 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Nov 26 01:45:26 localhost kernel: ohci-pci: OHCI PCI platform driver
Nov 26 01:45:26 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Nov 26 01:45:26 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 26 01:45:26 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 26 01:45:26 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 26 01:45:26 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 26 01:45:26 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 26 01:45:26 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 26 01:45:26 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 26 01:45:26 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Nov 26 01:45:26 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 26 01:45:26 localhost kernel: hub 1-0:1.0: USB hub found
Nov 26 01:45:26 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 26 01:45:26 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 26 01:45:26 localhost kernel: usbserial: USB Serial support registered for generic
Nov 26 01:45:26 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 26 01:45:26 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 26 01:45:26 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 26 01:45:26 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 26 01:45:26 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 26 01:45:26 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 26 01:45:26 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 26 01:45:26 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-26T06:45:25 UTC (1764139525)
Nov 26 01:45:26 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 26 01:45:26 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 26 01:45:26 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 26 01:45:26 localhost kernel: usbcore: registered new interface driver usbhid
Nov 26 01:45:26 localhost kernel: usbhid: USB HID core driver
Nov 26 01:45:26 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 26 01:45:26 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 26 01:45:26 localhost kernel: Initializing XFRM netlink socket
Nov 26 01:45:26 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 26 01:45:26 localhost kernel: Segment Routing with IPv6
Nov 26 01:45:26 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 26 01:45:26 localhost kernel: mpls_gso: MPLS GSO support
Nov 26 01:45:26 localhost kernel: IPI shorthand broadcast: enabled
Nov 26 01:45:26 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 26 01:45:26 localhost kernel: AES CTR mode by8 optimization enabled
Nov 26 01:45:26 localhost kernel: sched_clock: Marking stable (779785943, 175376822)->(1080170096, -125007331)
Nov 26 01:45:26 localhost kernel: registered taskstats version 1
Nov 26 01:45:26 localhost kernel: Loading compiled-in X.509 certificates
Nov 26 01:45:26 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 26 01:45:26 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 26 01:45:26 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 26 01:45:26 localhost kernel: zswap: loaded using pool lzo/zbud
Nov 26 01:45:26 localhost kernel: page_owner is disabled
Nov 26 01:45:26 localhost kernel: Key type big_key registered
Nov 26 01:45:26 localhost kernel: Freeing initrd memory: 74232K
Nov 26 01:45:26 localhost kernel: Key type encrypted registered
Nov 26 01:45:26 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 26 01:45:26 localhost kernel: Loading compiled-in module X.509 certificates
Nov 26 01:45:26 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 26 01:45:26 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 26 01:45:26 localhost kernel: ima: No architecture policies found
Nov 26 01:45:26 localhost kernel: evm: Initialising EVM extended attributes:
Nov 26 01:45:26 localhost kernel: evm: security.selinux
Nov 26 01:45:26 localhost kernel: evm: security.SMACK64 (disabled)
Nov 26 01:45:26 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 26 01:45:26 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 26 01:45:26 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 26 01:45:26 localhost kernel: evm: security.apparmor (disabled)
Nov 26 01:45:26 localhost kernel: evm: security.ima
Nov 26 01:45:26 localhost kernel: evm: security.capability
Nov 26 01:45:26 localhost kernel: evm: HMAC attrs: 0x1
Nov 26 01:45:26 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 26 01:45:26 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 26 01:45:26 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 26 01:45:26 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 26 01:45:26 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 26 01:45:26 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 26 01:45:26 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 26 01:45:26 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 26 01:45:26 localhost kernel: Freeing unused decrypted memory: 2036K
Nov 26 01:45:26 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Nov 26 01:45:26 localhost kernel: Write protecting the kernel read-only data: 26624k
Nov 26 01:45:26 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Nov 26 01:45:26 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Nov 26 01:45:26 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 26 01:45:26 localhost kernel: Run /init as init process
Nov 26 01:45:26 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 26 01:45:26 localhost systemd[1]: Detected virtualization kvm.
Nov 26 01:45:26 localhost systemd[1]: Detected architecture x86-64.
Nov 26 01:45:26 localhost systemd[1]: Running in initrd.
Nov 26 01:45:26 localhost systemd[1]: No hostname configured, using default hostname.
Nov 26 01:45:26 localhost systemd[1]: Hostname set to .
Nov 26 01:45:26 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 26 01:45:26 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 26 01:45:26 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 26 01:45:26 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 26 01:45:26 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 26 01:45:26 localhost systemd[1]: Reached target Local File Systems.
Nov 26 01:45:26 localhost systemd[1]: Reached target Path Units.
Nov 26 01:45:26 localhost systemd[1]: Reached target Slice Units.
Nov 26 01:45:26 localhost systemd[1]: Reached target Swaps.
Nov 26 01:45:26 localhost systemd[1]: Reached target Timer Units.
Nov 26 01:45:26 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 26 01:45:26 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 26 01:45:26 localhost systemd[1]: Listening on Journal Socket.
Nov 26 01:45:26 localhost systemd[1]: Listening on udev Control Socket.
Nov 26 01:45:26 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 26 01:45:26 localhost systemd[1]: Reached target Socket Units.
Nov 26 01:45:26 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 26 01:45:26 localhost systemd[1]: Starting Journal Service...
Nov 26 01:45:26 localhost systemd[1]: Starting Load Kernel Modules...
Nov 26 01:45:26 localhost systemd[1]: Starting Create System Users...
Nov 26 01:45:26 localhost systemd[1]: Starting Setup Virtual Console...
Nov 26 01:45:26 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 26 01:45:26 localhost systemd[1]: Finished Load Kernel Modules.
Nov 26 01:45:26 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 26 01:45:26 localhost systemd-journald[282]: Journal started
Nov 26 01:45:26 localhost systemd-journald[282]: Runtime Journal (/run/log/journal/54d67e253d534e7fba95c2d307a21761) is 8.0M, max 314.7M, 306.7M free.
Nov 26 01:45:26 localhost systemd-modules-load[283]: Module 'msr' is built in
Nov 26 01:45:26 localhost systemd[1]: Started Journal Service.
Nov 26 01:45:26 localhost systemd[1]: Finished Setup Virtual Console.
Nov 26 01:45:26 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 26 01:45:26 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 26 01:45:26 localhost systemd[1]: Starting dracut cmdline hook...
Nov 26 01:45:26 localhost systemd-sysusers[284]: Creating group 'sgx' with GID 997.
Nov 26 01:45:26 localhost systemd-sysusers[284]: Creating group 'users' with GID 100.
Nov 26 01:45:26 localhost systemd-sysusers[284]: Creating group 'dbus' with GID 81.
Nov 26 01:45:26 localhost systemd-sysusers[284]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 26 01:45:26 localhost systemd[1]: Finished Create System Users.
Nov 26 01:45:26 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 26 01:45:26 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 26 01:45:26 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 26 01:45:26 localhost dracut-cmdline[290]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Nov 26 01:45:26 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 26 01:45:26 localhost dracut-cmdline[290]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 26 01:45:26 localhost systemd[1]: Finished dracut cmdline hook.
Nov 26 01:45:26 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 26 01:45:26 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 26 01:45:26 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 26 01:45:26 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Nov 26 01:45:26 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 26 01:45:26 localhost kernel: RPC: Registered udp transport module.
Nov 26 01:45:26 localhost kernel: RPC: Registered tcp transport module.
Nov 26 01:45:26 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 26 01:45:26 localhost rpc.statd[406]: Version 2.5.4 starting
Nov 26 01:45:26 localhost rpc.statd[406]: Initializing NSM state
Nov 26 01:45:26 localhost rpc.idmapd[411]: Setting log level to 0
Nov 26 01:45:26 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 26 01:45:26 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 26 01:45:26 localhost systemd-udevd[424]: Using default interface naming scheme 'rhel-9.0'.
Nov 26 01:45:26 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 26 01:45:26 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 26 01:45:26 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 26 01:45:26 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 26 01:45:26 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 26 01:45:26 localhost systemd[1]: Reached target System Initialization.
Nov 26 01:45:26 localhost systemd[1]: Reached target Basic System.
Nov 26 01:45:26 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 26 01:45:26 localhost systemd[1]: Reached target Network.
Nov 26 01:45:26 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 26 01:45:26 localhost systemd[1]: Starting dracut initqueue hook...
Nov 26 01:45:26 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Nov 26 01:45:26 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Nov 26 01:45:26 localhost kernel: GPT:20971519 != 838860799
Nov 26 01:45:26 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Nov 26 01:45:26 localhost kernel: GPT:20971519 != 838860799
Nov 26 01:45:26 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Nov 26 01:45:26 localhost kernel: vda: vda1 vda2 vda3 vda4
Nov 26 01:45:26 localhost kernel: scsi host0: ata_piix
Nov 26 01:45:26 localhost kernel: scsi host1: ata_piix
Nov 26 01:45:26 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Nov 26 01:45:26 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Nov 26 01:45:26 localhost systemd-udevd[441]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 01:45:26 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 26 01:45:26 localhost systemd[1]: Reached target Initrd Root Device.
Nov 26 01:45:27 localhost kernel: ata1: found unknown device (class 0)
Nov 26 01:45:27 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 26 01:45:27 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Nov 26 01:45:27 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 26 01:45:27 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 26 01:45:27 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 26 01:45:27 localhost systemd[1]: Finished dracut initqueue hook.
Nov 26 01:45:27 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 26 01:45:27 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 26 01:45:27 localhost systemd[1]: Reached target Remote File Systems.
Nov 26 01:45:27 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 26 01:45:27 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 26 01:45:27 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Nov 26 01:45:27 localhost systemd-fsck[511]: /usr/sbin/fsck.xfs: XFS file system.
Nov 26 01:45:27 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 26 01:45:27 localhost systemd[1]: Mounting /sysroot...
Nov 26 01:45:27 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 26 01:45:27 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Nov 26 01:45:27 localhost kernel: XFS (vda4): Ending clean mount
Nov 26 01:45:27 localhost systemd[1]: Mounted /sysroot.
Nov 26 01:45:27 localhost systemd[1]: Reached target Initrd Root File System.
Nov 26 01:45:27 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 26 01:45:27 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 26 01:45:27 localhost systemd[1]: Reached target Initrd File Systems.
Nov 26 01:45:27 localhost systemd[1]: Reached target Initrd Default Target.
Nov 26 01:45:27 localhost systemd[1]: Starting dracut mount hook...
Nov 26 01:45:27 localhost systemd[1]: Finished dracut mount hook.
Nov 26 01:45:27 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 26 01:45:27 localhost rpc.idmapd[411]: exiting on signal 15
Nov 26 01:45:27 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 26 01:45:27 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 26 01:45:27 localhost systemd[1]: Stopped target Network.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Timer Units.
Nov 26 01:45:27 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 26 01:45:27 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Basic System.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Path Units.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Remote File Systems.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Slice Units.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Socket Units.
Nov 26 01:45:27 localhost systemd[1]: Stopped target System Initialization.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Local File Systems.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Swaps.
Nov 26 01:45:27 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped dracut mount hook.
Nov 26 01:45:27 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 26 01:45:27 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 26 01:45:27 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 26 01:45:27 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 26 01:45:27 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 26 01:45:27 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 26 01:45:27 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 26 01:45:27 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 26 01:45:27 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 26 01:45:27 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 26 01:45:27 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 26 01:45:27 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 26 01:45:27 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 26 01:45:27 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Closed udev Control Socket.
Nov 26 01:45:27 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Closed udev Kernel Socket.
Nov 26 01:45:27 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 26 01:45:27 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 26 01:45:27 localhost systemd[1]: Starting Cleanup udev Database...
Nov 26 01:45:27 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 26 01:45:27 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 26 01:45:27 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Stopped Create System Users.
Nov 26 01:45:27 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 26 01:45:27 localhost systemd[1]: Finished Cleanup udev Database.
Nov 26 01:45:27 localhost systemd[1]: Reached target Switch Root.
Nov 26 01:45:27 localhost systemd[1]: Starting Switch Root...
Nov 26 01:45:27 localhost systemd[1]: Switching root.
Nov 26 01:45:27 localhost systemd-journald[282]: Journal stopped
Nov 26 01:45:28 localhost systemd-journald[282]: Received SIGTERM from PID 1 (systemd).
Nov 26 01:45:28 localhost kernel: audit: type=1404 audit(1764139528.015:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 26 01:45:28 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 26 01:45:28 localhost kernel: SELinux: policy capability open_perms=1
Nov 26 01:45:28 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 26 01:45:28 localhost kernel: SELinux: policy capability always_check_network=0
Nov 26 01:45:28 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 26 01:45:28 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 26 01:45:28 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 26 01:45:28 localhost kernel: audit: type=1403 audit(1764139528.138:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 26 01:45:28 localhost systemd[1]: Successfully loaded SELinux policy in 126.466ms.
Nov 26 01:45:28 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 32.899ms.
Nov 26 01:45:28 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 26 01:45:28 localhost systemd[1]: Detected virtualization kvm.
Nov 26 01:45:28 localhost systemd[1]: Detected architecture x86-64.
Nov 26 01:45:28 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 01:45:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 01:45:28 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 26 01:45:28 localhost systemd[1]: Stopped Switch Root.
Nov 26 01:45:28 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 26 01:45:28 localhost systemd[1]: Created slice Slice /system/getty.
Nov 26 01:45:28 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 26 01:45:28 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 26 01:45:28 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 26 01:45:28 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Nov 26 01:45:28 localhost systemd[1]: Created slice User and Session Slice.
Nov 26 01:45:28 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 26 01:45:28 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 26 01:45:28 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 26 01:45:28 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 26 01:45:28 localhost systemd[1]: Stopped target Switch Root.
Nov 26 01:45:28 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 26 01:45:28 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 26 01:45:28 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 26 01:45:28 localhost systemd[1]: Reached target Path Units.
Nov 26 01:45:28 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 26 01:45:28 localhost systemd[1]: Reached target Slice Units.
Nov 26 01:45:28 localhost systemd[1]: Reached target Swaps.
Nov 26 01:45:28 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 26 01:45:28 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 26 01:45:28 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 26 01:45:28 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 26 01:45:28 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 26 01:45:28 localhost systemd[1]: Listening on udev Control Socket.
Nov 26 01:45:28 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 26 01:45:28 localhost systemd[1]: Mounting Huge Pages File System...
Nov 26 01:45:28 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 26 01:45:28 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 26 01:45:28 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 26 01:45:28 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 26 01:45:28 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 26 01:45:28 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 26 01:45:28 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 26 01:45:28 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 26 01:45:28 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 26 01:45:28 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 26 01:45:28 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 26 01:45:28 localhost systemd[1]: Stopped Journal Service.
Nov 26 01:45:28 localhost systemd[1]: Starting Journal Service...
Nov 26 01:45:28 localhost systemd[1]: Starting Load Kernel Modules...
Nov 26 01:45:28 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 26 01:45:28 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 26 01:45:28 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 26 01:45:28 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 26 01:45:28 localhost kernel: fuse: init (API version 7.36)
Nov 26 01:45:28 localhost systemd[1]: Mounted Huge Pages File System.
Nov 26 01:45:28 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 26 01:45:28 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 26 01:45:28 localhost systemd-journald[618]: Journal started
Nov 26 01:45:28 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7) is 8.0M, max 314.7M, 306.7M free.
Nov 26 01:45:28 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 26 01:45:28 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 26 01:45:28 localhost systemd-modules-load[619]: Module 'msr' is built in
Nov 26 01:45:28 localhost systemd[1]: Started Journal Service.
Nov 26 01:45:28 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 26 01:45:28 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 26 01:45:28 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 26 01:45:28 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 26 01:45:28 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 26 01:45:28 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 26 01:45:28 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 26 01:45:28 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 26 01:45:28 localhost systemd[1]: Finished Load Kernel Modules.
Nov 26 01:45:28 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 26 01:45:28 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 26 01:45:28 localhost systemd[1]: Mounting FUSE Control File System...
Nov 26 01:45:28 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 26 01:45:28 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 26 01:45:28 localhost kernel: ACPI: bus type drm_connector registered
Nov 26 01:45:28 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 26 01:45:28 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 26 01:45:28 localhost systemd[1]: Starting Load/Save Random Seed...
Nov 26 01:45:28 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 26 01:45:28 localhost systemd[1]: Starting Create System Users...
Nov 26 01:45:28 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 26 01:45:28 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 26 01:45:28 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7) is 8.0M, max 314.7M, 306.7M free.
Nov 26 01:45:28 localhost systemd-journald[618]: Received client request to flush runtime journal.
Nov 26 01:45:28 localhost systemd[1]: Mounted FUSE Control File System.
Nov 26 01:45:28 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 26 01:45:28 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 26 01:45:28 localhost systemd[1]: Finished Load/Save Random Seed.
Nov 26 01:45:28 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 26 01:45:28 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 26 01:45:28 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 26 01:45:28 localhost systemd-sysusers[630]: Creating group 'sgx' with GID 989.
Nov 26 01:45:28 localhost systemd-sysusers[630]: Creating group 'systemd-oom' with GID 988.
Nov 26 01:45:28 localhost systemd-sysusers[630]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Nov 26 01:45:28 localhost systemd[1]: Finished Create System Users.
Nov 26 01:45:28 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 26 01:45:28 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 26 01:45:28 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 26 01:45:28 localhost systemd[1]: Set up automount EFI System Partition Automount.
Nov 26 01:45:29 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 26 01:45:29 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 26 01:45:29 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Nov 26 01:45:29 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 26 01:45:29 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 26 01:45:29 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 26 01:45:29 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 26 01:45:29 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 26 01:45:29 localhost systemd-udevd[640]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 01:45:29 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Nov 26 01:45:29 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Nov 26 01:45:29 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Nov 26 01:45:29 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 26 01:45:29 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 26 01:45:29 localhost systemd-fsck[679]: fsck.fat 4.2 (2021-01-31)
Nov 26 01:45:29 localhost systemd-fsck[679]: /dev/vda2: 12 files, 1782/51145 clusters
Nov 26 01:45:29 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Nov 26 01:45:29 localhost kernel: SVM: TSC scaling supported
Nov 26 01:45:29 localhost kernel: kvm: Nested Virtualization enabled
Nov 26 01:45:29 localhost kernel: SVM: kvm: Nested Paging enabled
Nov 26 01:45:29 localhost kernel: SVM: LBR virtualization supported
Nov 26 01:45:29 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 26 01:45:29 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 26 01:45:29 localhost kernel: Console: switching to colour dummy device 80x25
Nov 26 01:45:29 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 26 01:45:29 localhost kernel: [drm] features: -context_init
Nov 26 01:45:29 localhost kernel: [drm] number of scanouts: 1
Nov 26 01:45:29 localhost kernel: [drm] number of cap sets: 0
Nov 26 01:45:29 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Nov 26 01:45:29 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Nov 26 01:45:29 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 26 01:45:29 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 26 01:45:29 localhost systemd[1]: Mounting /boot...
Nov 26 01:45:29 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Nov 26 01:45:29 localhost kernel: XFS (vda3): Ending clean mount
Nov 26 01:45:29 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Nov 26 01:45:29 localhost systemd[1]: Mounted /boot.
Nov 26 01:45:29 localhost systemd[1]: Mounting /boot/efi...
Nov 26 01:45:29 localhost systemd[1]: Mounted /boot/efi.
Nov 26 01:45:29 localhost systemd[1]: Reached target Local File Systems.
Nov 26 01:45:29 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 26 01:45:29 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 26 01:45:29 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 26 01:45:29 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 26 01:45:29 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 26 01:45:29 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 26 01:45:29 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 26 01:45:29 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 717 (bootctl)
Nov 26 01:45:29 localhost systemd[1]: Starting File System Check on /dev/vda2...
Nov 26 01:45:29 localhost systemd[1]: Finished File System Check on /dev/vda2.
Nov 26 01:45:29 localhost systemd[1]: Mounting EFI System Partition Automount...
Nov 26 01:45:29 localhost systemd[1]: Mounted EFI System Partition Automount.
Nov 26 01:45:30 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 26 01:45:30 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 26 01:45:30 localhost systemd[1]: Starting Security Auditing Service...
Nov 26 01:45:30 localhost systemd[1]: Starting RPC Bind...
Nov 26 01:45:30 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 26 01:45:30 localhost auditd[727]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Nov 26 01:45:30 localhost auditd[727]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Nov 26 01:45:30 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 26 01:45:30 localhost systemd[1]: Started RPC Bind.
Nov 26 01:45:30 localhost augenrules[732]: /sbin/augenrules: No change
Nov 26 01:45:30 localhost augenrules[742]: No rules
Nov 26 01:45:30 localhost augenrules[742]: enabled 1
Nov 26 01:45:30 localhost augenrules[742]: failure 1
Nov 26 01:45:30 localhost augenrules[742]: pid 727
Nov 26 01:45:30 localhost augenrules[742]: rate_limit 0
Nov 26 01:45:30 localhost augenrules[742]: backlog_limit 8192
Nov 26 01:45:30 localhost augenrules[742]: lost 0
Nov 26 01:45:30 localhost augenrules[742]: backlog 0
Nov 26 01:45:30 localhost augenrules[742]: backlog_wait_time 60000
Nov 26 01:45:30 localhost augenrules[742]: backlog_wait_time_actual 0
Nov 26 01:45:30 localhost augenrules[742]: enabled 1
Nov 26 01:45:30 localhost augenrules[742]: failure 1
Nov 26 01:45:30 localhost augenrules[742]: pid 727
Nov 26 01:45:30 localhost augenrules[742]: rate_limit 0
Nov 26 01:45:30 localhost augenrules[742]: backlog_limit 8192
Nov 26 01:45:30 localhost augenrules[742]: lost 0
Nov 26 01:45:30 localhost augenrules[742]: backlog 0
Nov 26 01:45:30 localhost augenrules[742]: backlog_wait_time 60000
Nov 26 01:45:30 localhost augenrules[742]: backlog_wait_time_actual 0
Nov 26 01:45:30 localhost augenrules[742]: enabled 1
Nov 26 01:45:30 localhost augenrules[742]: failure 1
Nov 26 01:45:30 localhost augenrules[742]: pid 727
Nov 26 01:45:30 localhost augenrules[742]: rate_limit 0
Nov 26 01:45:30 localhost augenrules[742]: backlog_limit 8192
Nov 26 01:45:30 localhost augenrules[742]: lost 0
Nov 26 01:45:30 localhost augenrules[742]: backlog 0
Nov 26 01:45:30 localhost augenrules[742]: backlog_wait_time 60000
Nov 26 01:45:30 localhost augenrules[742]: backlog_wait_time_actual 0
Nov 26 01:45:30 localhost systemd[1]: Started Security Auditing Service.
Nov 26 01:45:30 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 26 01:45:30 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 26 01:45:30 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 26 01:45:30 localhost systemd[1]: Starting Update is Completed...
Nov 26 01:45:30 localhost systemd[1]: Finished Update is Completed.
Nov 26 01:45:30 localhost systemd[1]: Reached target System Initialization.
Nov 26 01:45:30 localhost systemd[1]: Started dnf makecache --timer.
Nov 26 01:45:30 localhost systemd[1]: Started Daily rotation of log files.
Nov 26 01:45:30 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 26 01:45:30 localhost systemd[1]: Reached target Timer Units.
Nov 26 01:45:30 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 26 01:45:30 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 26 01:45:30 localhost systemd[1]: Reached target Socket Units.
Nov 26 01:45:30 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Nov 26 01:45:30 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 26 01:45:30 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 26 01:45:30 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 26 01:45:30 localhost systemd[1]: Reached target Basic System.
Nov 26 01:45:30 localhost journal[752]: Ready
Nov 26 01:45:30 localhost systemd[1]: Starting NTP client/server...
Nov 26 01:45:30 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 26 01:45:30 localhost systemd[1]: Started irqbalance daemon.
Nov 26 01:45:30 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 26 01:45:30 localhost systemd[1]: Starting System Logging Service...
Nov 26 01:45:30 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 01:45:30 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 01:45:30 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 01:45:30 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 26 01:45:30 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 26 01:45:30 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 26 01:45:30 localhost systemd[1]: Starting User Login Management...
Nov 26 01:45:30 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 26 01:45:30 localhost systemd[1]: Started System Logging Service.
Nov 26 01:45:30 localhost rsyslogd[760]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="760" x-info="https://www.rsyslog.com"] start
Nov 26 01:45:30 localhost rsyslogd[760]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Nov 26 01:45:30 localhost chronyd[767]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 26 01:45:30 localhost chronyd[767]: Using right/UTC timezone to obtain leap second data
Nov 26 01:45:30 localhost chronyd[767]: Loaded seccomp filter (level 2)
Nov 26 01:45:30 localhost systemd[1]: Started NTP client/server.
Nov 26 01:45:30 localhost systemd-logind[761]: New seat seat0.
Nov 26 01:45:30 localhost systemd-logind[761]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 26 01:45:30 localhost systemd-logind[761]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 26 01:45:30 localhost systemd[1]: Started User Login Management.
Nov 26 01:45:30 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 26 01:45:30 localhost cloud-init[771]: Cloud-init v. 22.1-9.el9 running 'init-local' at Wed, 26 Nov 2025 06:45:30 +0000. Up 6.17 seconds.
Nov 26 01:45:31 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpv21ws3eg.mount: Deactivated successfully.
Nov 26 01:45:31 localhost systemd[1]: Starting Hostname Service...
Nov 26 01:45:31 localhost systemd[1]: Started Hostname Service.
Nov 26 01:45:31 localhost systemd-hostnamed[786]: Hostname set to (static)
Nov 26 01:45:31 localhost systemd[1]: Finished Initial cloud-init job (pre-networking).
Nov 26 01:45:31 localhost systemd[1]: Reached target Preparation for Network.
Nov 26 01:45:31 localhost systemd[1]: Starting Network Manager...
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.4832] NetworkManager (version 1.42.2-1.el9) is starting... (boot:819ec951-3642-4926-8ee1-046701615667)
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.4842] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 26 01:45:31 localhost systemd[1]: Started Network Manager.
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.4895] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 26 01:45:31 localhost systemd[1]: Reached target Network.
Nov 26 01:45:31 localhost systemd[1]: Starting Network Manager Wait Online...
Nov 26 01:45:31 localhost systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.4997] manager[0x56075fd2d020]: monitoring kernel firmware directory '/lib/firmware'.
Nov 26 01:45:31 localhost systemd[1]: Starting Enable periodic update of entitlement certificates....
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5041] hostname: hostname: using hostnamed
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5042] hostname: static hostname changed from (none) to "np0005536118.novalocal"
Nov 26 01:45:31 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5065] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 26 01:45:31 localhost systemd[1]: Started Enable periodic update of entitlement certificates..
Nov 26 01:45:31 localhost systemd[1]: Started GSSAPI Proxy Daemon.
Nov 26 01:45:31 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 26 01:45:31 localhost systemd[1]: Reached target NFS client services.
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5242] manager[0x56075fd2d020]: rfkill: Wi-Fi hardware radio set enabled
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5242] manager[0x56075fd2d020]: rfkill: WWAN hardware radio set enabled
Nov 26 01:45:31 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 26 01:45:31 localhost systemd[1]: Reached target Remote File Systems.
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5321] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5322] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 26 01:45:31 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5332] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5334] manager: Networking is enabled by state file
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5373] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5373] settings: Loaded settings plugin: keyfile (internal)
Nov 26 01:45:31 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5419] dhcp: init: Using DHCP client 'internal'
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5424] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5449] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5460] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5473] device (lo): Activation: starting connection 'lo' (1f7610b0-ce6c-4927-8bc7-7b36b520bcf1)
Nov 26 01:45:31 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5492] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5498] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5552] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5557] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5576] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5579] device (eth0): carrier: link connected
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5583] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5591] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5617] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5624] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5625] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5629] manager: NetworkManager state is now CONNECTING
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5631] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5642] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5646] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 26 01:45:31 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5696] dhcp4 (eth0): state changed new lease, address=38.102.83.176
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5701] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5734] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5755] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5758] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5767] device (lo): Activation: successful, device activated.
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5779] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5782] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5790] manager: NetworkManager state is now CONNECTED_SITE
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5795] device (eth0): Activation: successful, device activated.
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5804] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 26 01:45:31 localhost NetworkManager[791]: [1764139531.5812] manager: startup complete
Nov 26 01:45:31 localhost systemd[1]: Finished Network Manager Wait Online.
Nov 26 01:45:31 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Nov 26 01:45:31 localhost cloud-init[935]: Cloud-init v. 22.1-9.el9 running 'init' at Wed, 26 Nov 2025 06:45:31 +0000. Up 7.04 seconds.
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | Device | Up | Address | Mask | Scope | Hw-Address |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | eth0 | True | 38.102.83.176 | 255.255.255.0 | global | fa:16:3e:73:ba:36 |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | eth0 | True | fe80::f816:3eff:fe73:ba36/64 | . | link | fa:16:3e:73:ba:36 |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | lo | True | 127.0.0.1 | 255.0.0.0 | host | . |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | lo | True | ::1/128 | . | host | . |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | Route | Destination | Gateway | Genmask | Interface | Flags |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | 0 | 0.0.0.0 | 38.102.83.1 | 0.0.0.0 | eth0 | UG |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | 1 | 38.102.83.0 | 0.0.0.0 | 255.255.255.0 | eth0 | U |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | 2 | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 | eth0 | UGH |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | 1 | fe80::/64 | :: | eth0 | U |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: | 3 | multicast | :: | eth0 | U |
Nov 26 01:45:31 localhost cloud-init[935]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 26 01:45:31 localhost systemd[1]: Starting Authorization Manager...
Nov 26 01:45:31 localhost polkitd[1038]: Started polkitd version 0.117
Nov 26 01:45:32 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Nov 26 01:45:32 localhost systemd[1]: Started Authorization Manager.
Nov 26 01:45:35 localhost cloud-init[935]: Generating public/private rsa key pair.
Nov 26 01:45:35 localhost cloud-init[935]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 26 01:45:35 localhost cloud-init[935]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 26 01:45:35 localhost cloud-init[935]: The key fingerprint is:
Nov 26 01:45:35 localhost cloud-init[935]: SHA256:OYu7t1y/xFGi93LGJx8kvi29yf3kOqHzXWQa/+mJcSk root@np0005536118.novalocal
Nov 26 01:45:35 localhost cloud-init[935]: The key's randomart image is:
Nov 26 01:45:35 localhost cloud-init[935]: +---[RSA 3072]----+
Nov 26 01:45:35 localhost cloud-init[935]: | |
Nov 26 01:45:35 localhost cloud-init[935]: | |
Nov 26 01:45:35 localhost cloud-init[935]: | . . |
Nov 26 01:45:35 localhost cloud-init[935]: | . . o |
Nov 26 01:45:35 localhost cloud-init[935]: | S . o...o|
Nov 26 01:45:35 localhost cloud-init[935]: | . o o.++*.|
Nov 26 01:45:35 localhost cloud-init[935]: | . . . +EO==|
Nov 26 01:45:35 localhost cloud-init[935]: | o.. oo=XOO|
Nov 26 01:45:35 localhost cloud-init[935]: | ooo. o*=@X|
Nov 26 01:45:35 localhost cloud-init[935]: +----[SHA256]-----+
Nov 26 01:45:35 localhost cloud-init[935]: Generating public/private ecdsa key pair.
Nov 26 01:45:35 localhost cloud-init[935]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 26 01:45:35 localhost cloud-init[935]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 26 01:45:35 localhost cloud-init[935]: The key fingerprint is:
Nov 26 01:45:35 localhost cloud-init[935]: SHA256:ZXV8YimzzhmP81oVW6cfhU1ot02LyYLPQwKt17fdqvY root@np0005536118.novalocal
Nov 26 01:45:35 localhost cloud-init[935]: The key's randomart image is:
Nov 26 01:45:35 localhost cloud-init[935]: +---[ECDSA 256]---+
Nov 26 01:45:35 localhost cloud-init[935]: | . ...o.|
Nov 26 01:45:35 localhost cloud-init[935]: | . . .o.B++|
Nov 26 01:45:35 localhost cloud-init[935]: | o = .*+*O|
Nov 26 01:45:35 localhost cloud-init[935]: | . * +o= +B|
Nov 26 01:45:35 localhost cloud-init[935]: | S =oo=+oo|
Nov 26 01:45:35 localhost cloud-init[935]: | +*..+o|
Nov 26 01:45:35 localhost cloud-init[935]: | .o...|
Nov 26 01:45:35 localhost cloud-init[935]: | ..o |
Nov 26 01:45:35 localhost cloud-init[935]: | .o+E |
Nov 26 01:45:35 localhost cloud-init[935]: +----[SHA256]-----+
Nov 26 01:45:35 localhost cloud-init[935]: Generating public/private ed25519 key pair.
Nov 26 01:45:35 localhost cloud-init[935]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 26 01:45:35 localhost cloud-init[935]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 26 01:45:35 localhost cloud-init[935]: The key fingerprint is:
Nov 26 01:45:35 localhost cloud-init[935]: SHA256:V4XXaKUaIOgCOrtEqpjPLmCe2kI/N+Ld4YPMWnEofSg root@np0005536118.novalocal
Nov 26 01:45:35 localhost cloud-init[935]: The key's randomart image is:
Nov 26 01:45:35 localhost cloud-init[935]: +--[ED25519 256]--+
Nov 26 01:45:35 localhost cloud-init[935]: | .. . ..+.|
Nov 26 01:45:35 localhost cloud-init[935]: | . . . ...+..|
Nov 26 01:45:35 localhost cloud-init[935]: | . . . oo. |
Nov 26 01:45:35 localhost cloud-init[935]: |o. o + . o |
Nov 26 01:45:35 localhost cloud-init[935]: |oo E * oS . . |
Nov 26 01:45:35 localhost cloud-init[935]: |++ o + . |
Nov 26 01:45:35 localhost cloud-init[935]: |Ooo o... |
Nov 26 01:45:35 localhost cloud-init[935]: |B= +o*o.. |
Nov 26 01:45:35 localhost cloud-init[935]: |o=*o=..o. |
Nov 26 01:45:35 localhost cloud-init[935]: +----[SHA256]-----+
Nov 26 01:45:35 localhost sm-notify[1134]: Version 2.5.4 starting
Nov 26 01:45:35 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Nov 26 01:45:35 localhost systemd[1]: Reached target Cloud-config availability.
Nov 26 01:45:35 localhost systemd[1]: Reached target Network is Online.
Nov 26 01:45:35 localhost systemd[1]: Starting Apply the settings specified in cloud-config...
Nov 26 01:45:35 localhost sshd[1135]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:35 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Nov 26 01:45:35 localhost systemd[1]: Starting Crash recovery kernel arming...
Nov 26 01:45:35 localhost systemd[1]: Starting Notify NFS peers of a restart...
Nov 26 01:45:35 localhost systemd[1]: Starting OpenSSH server daemon...
Nov 26 01:45:35 localhost systemd[1]: Starting Permit User Sessions...
Nov 26 01:45:35 localhost systemd[1]: Started Notify NFS peers of a restart.
Nov 26 01:45:35 localhost systemd[1]: Finished Permit User Sessions.
Nov 26 01:45:35 localhost systemd[1]: Started Command Scheduler.
Nov 26 01:45:35 localhost systemd[1]: Started Getty on tty1.
Nov 26 01:45:35 localhost systemd[1]: Started Serial Getty on ttyS0.
Nov 26 01:45:35 localhost systemd[1]: Reached target Login Prompts.
Nov 26 01:45:35 localhost systemd[1]: Started OpenSSH server daemon.
Nov 26 01:45:35 localhost systemd[1]: Reached target Multi-User System.
Nov 26 01:45:35 localhost systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 26 01:45:35 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 26 01:45:35 localhost systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 26 01:45:35 localhost sshd[1158]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:35 localhost sshd[1171]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:35 localhost sshd[1189]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:35 localhost kdumpctl[1142]: kdump: No kdump initial ramdisk found.
Nov 26 01:45:35 localhost kdumpctl[1142]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Nov 26 01:45:35 localhost sshd[1197]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:35 localhost sshd[1204]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:36 localhost sshd[1219]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:36 localhost sshd[1243]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:36 localhost cloud-init[1265]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Wed, 26 Nov 2025 06:45:36 +0000. Up 11.25 seconds.
Nov 26 01:45:36 localhost sshd[1263]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:36 localhost sshd[1276]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:45:36 localhost systemd[1]: Finished Apply the settings specified in cloud-config.
Nov 26 01:45:36 localhost systemd[1]: Starting Execute cloud user/final scripts...
Nov 26 01:45:36 localhost cloud-init[1440]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Wed, 26 Nov 2025 06:45:36 +0000. Up 11.59 seconds.
Nov 26 01:45:36 localhost dracut[1439]: dracut-057-21.git20230214.el9
Nov 26 01:45:36 localhost cloud-init[1457]: #############################################################
Nov 26 01:45:36 localhost cloud-init[1459]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 26 01:45:36 localhost cloud-init[1465]: 256 SHA256:ZXV8YimzzhmP81oVW6cfhU1ot02LyYLPQwKt17fdqvY root@np0005536118.novalocal (ECDSA)
Nov 26 01:45:36 localhost chronyd[767]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Nov 26 01:45:36 localhost chronyd[767]: System clock TAI offset set to 37 seconds
Nov 26 01:45:36 localhost cloud-init[1472]: 256 SHA256:V4XXaKUaIOgCOrtEqpjPLmCe2kI/N+Ld4YPMWnEofSg root@np0005536118.novalocal (ED25519)
Nov 26 01:45:36 localhost dracut[1442]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Nov 26 01:45:36 localhost cloud-init[1479]: 3072 SHA256:OYu7t1y/xFGi93LGJx8kvi29yf3kOqHzXWQa/+mJcSk root@np0005536118.novalocal (RSA)
Nov 26 01:45:36 localhost cloud-init[1481]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 26 01:45:36 localhost cloud-init[1483]: #############################################################
Nov 26 01:45:36 localhost cloud-init[1440]: Cloud-init v. 22.1-9.el9 finished at Wed, 26 Nov 2025 06:45:36 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 11.85 seconds
Nov 26 01:45:36 localhost systemd[1]: Reloading Network Manager...
Nov 26 01:45:36 localhost NetworkManager[791]: [1764139536.7448] audit: op="reload" arg="0" pid=1568 uid=0 result="success"
Nov 26 01:45:36 localhost NetworkManager[791]: [1764139536.7458] config: signal: SIGHUP (no changes from disk)
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 26 01:45:36 localhost systemd[1]: Reloaded Network Manager.
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 26 01:45:36 localhost systemd[1]: Finished Execute cloud user/final scripts.
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 26 01:45:36 localhost systemd[1]: Reached target Cloud-init target.
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 26 01:45:36 localhost dracut[1442]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: memstrack is not available
Nov 26 01:45:37 localhost dracut[1442]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 26 01:45:37 localhost dracut[1442]: memstrack is not available
Nov 26 01:45:37 localhost dracut[1442]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 26 01:45:37 localhost chronyd[767]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Nov 26 01:45:37 localhost dracut[1442]: *** Including module: systemd ***
Nov 26 01:45:38 localhost dracut[1442]: *** Including module: systemd-initrd ***
Nov 26 01:45:38 localhost dracut[1442]: *** Including module: i18n ***
Nov 26 01:45:38 localhost dracut[1442]: No KEYMAP configured.
Nov 26 01:45:38 localhost dracut[1442]: *** Including module: drm ***
Nov 26 01:45:38 localhost dracut[1442]: *** Including module: prefixdevname ***
Nov 26 01:45:38 localhost dracut[1442]: *** Including module: kernel-modules ***
Nov 26 01:45:39 localhost dracut[1442]: *** Including module: kernel-modules-extra ***
Nov 26 01:45:39 localhost dracut[1442]: *** Including module: qemu ***
Nov 26 01:45:39 localhost dracut[1442]: *** Including module: fstab-sys ***
Nov 26 01:45:39 localhost dracut[1442]: *** Including module: rootfs-block ***
Nov 26 01:45:39 localhost dracut[1442]: *** Including module: terminfo ***
Nov 26 01:45:39 localhost dracut[1442]: *** Including module: udev-rules ***
Nov 26 01:45:40 localhost dracut[1442]: Skipping udev rule: 91-permissions.rules
Nov 26 01:45:40 localhost dracut[1442]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 26 01:45:40 localhost dracut[1442]: *** Including module: virtiofs ***
Nov 26 01:45:40 localhost dracut[1442]: *** Including module: dracut-systemd ***
Nov 26 01:45:40 localhost dracut[1442]: *** Including module: usrmount ***
Nov 26 01:45:40 localhost dracut[1442]: *** Including module: base ***
Nov 26 01:45:40 localhost dracut[1442]: *** Including module: fs-lib ***
Nov 26 01:45:40 localhost dracut[1442]: *** Including module: kdumpbase ***
Nov 26 01:45:40 localhost dracut[1442]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl module: mangling fw_dir
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl: configuration "intel" is ignored
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 26 01:45:40 localhost dracut[1442]: microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 26 01:45:41 localhost dracut[1442]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Nov 26 01:45:41 localhost dracut[1442]: *** Including module: shutdown ***
Nov 26 01:45:41 localhost dracut[1442]: *** Including module: squash ***
Nov 26 01:45:41 localhost dracut[1442]: *** Including modules done ***
Nov 26 01:45:41 localhost dracut[1442]: *** Installing kernel module dependencies ***
Nov 26 01:45:41 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 26 01:45:41 localhost dracut[1442]: *** Installing kernel module dependencies done ***
Nov 26 01:45:41 localhost dracut[1442]: *** Resolving executable dependencies ***
Nov 26 01:45:43 localhost dracut[1442]: *** Resolving executable dependencies done ***
Nov 26 01:45:43 localhost dracut[1442]: *** Hardlinking files ***
Nov 26 01:45:43 localhost dracut[1442]: Mode: real
Nov 26 01:45:43 localhost dracut[1442]: Files: 1099
Nov 26 01:45:43 localhost dracut[1442]: Linked: 3 files
Nov 26 01:45:43 localhost dracut[1442]: Compared: 0 xattrs
Nov 26 01:45:43 localhost dracut[1442]: Compared: 373 files
Nov 26 01:45:43 localhost dracut[1442]: Saved: 61.04 KiB
Nov 26 01:45:43 localhost dracut[1442]: Duration: 0.039147 seconds
Nov 26 01:45:43 localhost dracut[1442]: *** Hardlinking files done ***
Nov 26 01:45:43 localhost dracut[1442]: Could not find 'strip'. Not stripping the initramfs.
Nov 26 01:45:43 localhost dracut[1442]: *** Generating early-microcode cpio image ***
Nov 26 01:45:43 localhost dracut[1442]: *** Constructing AuthenticAMD.bin ***
Nov 26 01:45:43 localhost dracut[1442]: *** Store current command line parameters ***
Nov 26 01:45:43 localhost dracut[1442]: Stored kernel commandline:
Nov 26 01:45:43 localhost dracut[1442]: No dracut internal kernel commandline stored in the initramfs
Nov 26 01:45:43 localhost dracut[1442]: *** Install squash loader ***
Nov 26 01:45:44 localhost dracut[1442]: *** Squashing the files inside the initramfs ***
Nov 26 01:45:45 localhost dracut[1442]: *** Squashing the files inside the initramfs done ***
Nov 26 01:45:45 localhost dracut[1442]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Nov 26 01:45:45 localhost dracut[1442]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Nov 26 01:45:45 localhost kdumpctl[1142]: kdump: kexec: loaded kdump kernel
Nov 26 01:45:45 localhost kdumpctl[1142]: kdump: Starting kdump: [OK]
Nov 26 01:45:45 localhost systemd[1]: Finished Crash recovery kernel arming.
Nov 26 01:45:45 localhost systemd[1]: Startup finished in 1.262s (kernel) + 1.982s (initrd) + 17.681s (userspace) = 20.926s.
Nov 26 01:46:01 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 26 01:46:34 localhost sshd[4179]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 01:46:35 localhost systemd[1]: Created slice User Slice of UID 1000.
Nov 26 01:46:35 localhost systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 26 01:46:35 localhost systemd-logind[761]: New session 1 of user zuul.
Nov 26 01:46:35 localhost systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 26 01:46:35 localhost systemd[1]: Starting User Manager for UID 1000...
Nov 26 01:46:35 localhost systemd[4183]: Queued start job for default target Main User Target.
Nov 26 01:46:35 localhost systemd[4183]: Created slice User Application Slice.
Nov 26 01:46:35 localhost systemd[4183]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 26 01:46:35 localhost systemd[4183]: Started Daily Cleanup of User's Temporary Directories.
Nov 26 01:46:35 localhost systemd[4183]: Reached target Paths.
Nov 26 01:46:35 localhost systemd[4183]: Reached target Timers.
Nov 26 01:46:35 localhost systemd[4183]: Starting D-Bus User Message Bus Socket...
Nov 26 01:46:35 localhost systemd[4183]: Starting Create User's Volatile Files and Directories...
Nov 26 01:46:35 localhost systemd[4183]: Finished Create User's Volatile Files and Directories.
Nov 26 01:46:35 localhost systemd[4183]: Listening on D-Bus User Message Bus Socket.
Nov 26 01:46:35 localhost systemd[4183]: Reached target Sockets.
Nov 26 01:46:35 localhost systemd[4183]: Reached target Basic System.
Nov 26 01:46:35 localhost systemd[4183]: Reached target Main User Target.
Nov 26 01:46:35 localhost systemd[4183]: Startup finished in 122ms.
Nov 26 01:46:35 localhost systemd[1]: Started User Manager for UID 1000.
Nov 26 01:46:35 localhost systemd[1]: Started Session 1 of User zuul.
Nov 26 01:46:35 localhost python3[4235]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 01:46:45 localhost python3[4253]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 01:46:52 localhost python3[4307]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 01:46:54 localhost python3[4337]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 26 01:46:57 localhost python3[4353]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkJddqZ+TwuLMCoD/CUKb6dnZ5nImZkr99k28vGFQTZD8B2L/Jx+KwKLctJwJdAbPZC/wCl/36ZPjbla3kwCBCcgDq4oMWypJH1O/63E9BgGHNHKyv8+W8cLdCN1zy1EpGO62uGHVn4l57+Bp2T37Fy3IKVmX+tQkDoTdmzgtr5i8E1khji5awitbNX6RCXkWRlMkvVByLh74T7HTnO21e4xp556VlHAFGjYIDNAjgNkyhO6M9ssBagiIOrBzbXvnmNyZxIeiznzLQGBwty3La7OiGgztNcwLCRTVHG+4hwiKk7RIRradK18HqKab9McNcGbbIU/uUQYbYTPIEWiEmDTYeyTBoy+veLsVUYfXRLJDerz6WvmIUiiLVU0ABmx7b9k9dwjYa9U8tscYuTfYVjocSnR3IVQDEikuw4Bklms2ijHLwfRS9oeb9XvpqyM10A4FQnSLPgHdrRpCWBm4+Nek0Esi3RXYub8PT5HuL5Q87j+qe66WazVu6iSRRGCM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:46:58 localhost python3[4367]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:46:59 localhost python3[4426]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 01:46:59 localhost python3[4467]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764139618.9884021-391-195477513002575/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=d32fdd92636a49f1ac6e190427f27813_id_rsa follow=False checksum=c5afe835443889fe9adecf4b2807aa7da7e61790 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:01 localhost python3[4540]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 01:47:01 localhost python3[4581]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764139620.7554715-491-52760578354672/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=d32fdd92636a49f1ac6e190427f27813_id_rsa.pub follow=False checksum=868249ad8a507d032dbe5de33122c4f3d30d20a1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:03 localhost python3[4609]: ansible-ping Invoked with data=pong
Nov 26 01:47:05 localhost python3[4623]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 01:47:08 localhost python3[4676]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 26 01:47:10 localhost python3[4698]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:11 localhost python3[4712]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:11 localhost python3[4726]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:12 localhost python3[4740]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:12 localhost python3[4754]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:12 localhost python3[4768]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:15 localhost python3[4784]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:17 localhost python3[4832]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 01:47:17 localhost python3[4875]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764139636.9241211-102-142635187223138/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:47:25 localhost python3[4903]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:25 localhost python3[4917]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:25 localhost python3[4931]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:25 localhost python3[4945]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:26 localhost python3[4959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:26 localhost python3[4973]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:26 localhost python3[4987]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:26 localhost python3[5001]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:27 localhost python3[5015]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:27 localhost python3[5029]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:27 localhost python3[5043]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:27 localhost python3[5057]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:28 localhost python3[5071]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:28 localhost python3[5085]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:28 localhost python3[5099]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:28 localhost python3[5113]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:29 localhost python3[5127]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:29 localhost python3[5141]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:29 localhost python3[5155]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:30 localhost python3[5169]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:30 localhost python3[5183]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:30 localhost python3[5197]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:30 localhost python3[5211]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 01:47:31 localhost python3[5226]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1
vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 26 01:47:31 localhost python3[5240]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 26 01:47:31 localhost python3[5254]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 26 01:47:33 localhost python3[5270]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Nov 26 01:47:33 localhost systemd[1]: Starting Time & Date Service... Nov 26 01:47:33 localhost systemd[1]: Started Time & Date Service. Nov 26 01:47:33 localhost systemd-timedated[5272]: Changed time zone to 'UTC' (UTC). 
Nov 26 01:47:34 localhost python3[5291]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:47:35 localhost python3[5337]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 01:47:35 localhost python3[5378]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764139655.0880618-493-20269395187207/source _original_basename=tmptms2ksrz follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:47:37 localhost python3[5438]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 01:47:37 localhost python3[5479]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764139656.759253-587-202687938755443/source _original_basename=tmph_4kzvv0 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:47:39 localhost python3[5541]: ansible-ansible.legacy.stat Invoked 
with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 01:47:39 localhost python3[5584]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764139658.8072507-728-199278111818686/source _original_basename=tmpm_vcima9 follow=False checksum=8e0e434468aa50922357fbdb56d8b197f48f0949 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:47:40 localhost python3[5612]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 01:47:40 localhost python3[5628]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 01:47:43 localhost python3[5678]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 01:47:43 localhost python3[5721]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764139663.3326068-856-212828855137117/source _original_basename=tmp7dxvs39x follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:47:45 localhost python3[5752]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-e0f0-6ff9-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 01:47:46 localhost python3[5770]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-e0f0-6ff9-000000000024-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None Nov 26 01:47:48 localhost python3[5788]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:48:03 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. Nov 26 01:48:08 localhost python3[5806]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:48:37 localhost systemd[4183]: Starting Mark boot as successful... Nov 26 01:48:37 localhost systemd[4183]: Finished Mark boot as successful. 
Nov 26 01:49:08 localhost systemd-logind[761]: Session 1 logged out. Waiting for processes to exit. Nov 26 01:50:11 localhost systemd[1]: Unmounting EFI System Partition Automount... Nov 26 01:50:11 localhost systemd[1]: efi.mount: Deactivated successfully. Nov 26 01:50:11 localhost systemd[1]: Unmounted EFI System Partition Automount. Nov 26 01:51:37 localhost systemd[4183]: Created slice User Background Tasks Slice. Nov 26 01:51:37 localhost systemd[4183]: Starting Cleanup of User's Temporary Files and Directories... Nov 26 01:51:37 localhost systemd[4183]: Finished Cleanup of User's Temporary Files and Directories. Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f] Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff] Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref] Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref] Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref] Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref] Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff] Nov 26 01:51:52 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f] Nov 26 01:51:52 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003) Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3650] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3) Nov 26 01:51:52 localhost systemd-udevd[5815]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3803] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external') Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3873] settings: (eth1): created default wired connection 'Wired connection 1' Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3879] device (eth1): carrier: link connected Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3883] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed') Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3889] policy: auto-activating connection 'Wired connection 1' (d78ad6ae-f9c3-3bfa-a756-93d7bf539310) Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3896] device (eth1): Activation: starting connection 'Wired connection 1' (d78ad6ae-f9c3-3bfa-a756-93d7bf539310) Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3897] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed') Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3902] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed') Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3908] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed') Nov 26 01:51:52 localhost NetworkManager[791]: [1764139912.3912] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds) Nov 26 01:51:53 localhost sshd[5818]: main: sshd: ssh-rsa algorithm is disabled Nov 26 01:51:53 localhost systemd-logind[761]: New session 3 of user zuul. Nov 26 01:51:53 localhost systemd[1]: Started Session 3 of User zuul. 
Nov 26 01:51:53 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready Nov 26 01:51:53 localhost python3[5835]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-afea-b643-000000000408-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 01:52:06 localhost python3[5885]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 01:52:07 localhost python3[5928]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764139926.4173217-486-169108043579783/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=79046a23eecaca3746001ee26819fc0e1c1c25dd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:52:07 localhost python3[5958]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 01:52:08 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully. Nov 26 01:52:08 localhost systemd[1]: Stopped Network Manager Wait Online. Nov 26 01:52:08 localhost systemd[1]: Stopping Network Manager Wait Online... Nov 26 01:52:08 localhost NetworkManager[791]: [1764139928.7236] caught SIGTERM, shutting down normally. Nov 26 01:52:08 localhost systemd[1]: Stopping Network Manager... 
Nov 26 01:52:08 localhost NetworkManager[791]: [1764139928.7328] dhcp4 (eth0): canceled DHCP transaction Nov 26 01:52:08 localhost NetworkManager[791]: [1764139928.7328] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds) Nov 26 01:52:08 localhost NetworkManager[791]: [1764139928.7329] dhcp4 (eth0): state changed no lease Nov 26 01:52:08 localhost NetworkManager[791]: [1764139928.7335] manager: NetworkManager state is now CONNECTING Nov 26 01:52:08 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Nov 26 01:52:08 localhost NetworkManager[791]: [1764139928.7428] dhcp4 (eth1): canceled DHCP transaction Nov 26 01:52:08 localhost NetworkManager[791]: [1764139928.7429] dhcp4 (eth1): state changed no lease Nov 26 01:52:08 localhost NetworkManager[791]: [1764139928.7522] exiting (success) Nov 26 01:52:08 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Nov 26 01:52:08 localhost systemd[1]: NetworkManager.service: Deactivated successfully. Nov 26 01:52:08 localhost systemd[1]: Stopped Network Manager. Nov 26 01:52:08 localhost systemd[1]: NetworkManager.service: Consumed 2.070s CPU time. Nov 26 01:52:08 localhost systemd[1]: Starting Network Manager... Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8021] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:819ec951-3642-4926-8ee1-046701615667) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8023] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf) Nov 26 01:52:08 localhost systemd[1]: Started Network Manager. Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8047] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager" Nov 26 01:52:08 localhost systemd[1]: Starting Network Manager Wait Online... Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8102] manager[0x556b72b47090]: monitoring kernel firmware directory '/lib/firmware'. 
Nov 26 01:52:08 localhost systemd[1]: Starting Hostname Service... Nov 26 01:52:08 localhost systemd[1]: Started Hostname Service. Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8969] hostname: hostname: using hostnamed Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8973] hostname: static hostname changed from (none) to "np0005536118.novalocal" Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8981] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8987] manager[0x556b72b47090]: rfkill: Wi-Fi hardware radio set enabled Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.8988] manager[0x556b72b47090]: rfkill: WWAN hardware radio set enabled Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9031] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9031] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9033] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9033] manager: Networking is enabled by state file Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9052] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so") Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9053] settings: Loaded settings plugin: keyfile (internal) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9103] dhcp: init: Using DHCP client 'internal' Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9107] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9116] device (lo): state change: unmanaged -> 
unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9123] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9136] device (lo): Activation: starting connection 'lo' (1f7610b0-ce6c-4927-8bc7-7b36b520bcf1) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9145] device (eth0): carrier: link connected Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9151] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9158] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9159] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9168] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9178] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9185] device (eth1): carrier: link connected Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9191] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9200] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (d78ad6ae-f9c3-3bfa-a756-93d7bf539310) (indicated) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9200] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', 
sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9208] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9218] device (eth1): Activation: starting connection 'Wired connection 1' (d78ad6ae-f9c3-3bfa-a756-93d7bf539310) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9247] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9251] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9254] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9258] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9263] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9266] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9271] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9276] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9288] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9295] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds) Nov 26 01:52:08 localhost NetworkManager[5970]: 
[1764139928.9312] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9317] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds) Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9372] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9381] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9390] device (lo): Activation: successful, device activated. Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9449] dhcp4 (eth0): state changed new lease, address=38.102.83.176 Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9456] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9565] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9587] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9589] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9594] manager: NetworkManager state is now CONNECTED_SITE Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9600] device (eth0): Activation: successful, device activated. 
Nov 26 01:52:08 localhost NetworkManager[5970]: [1764139928.9608] manager: NetworkManager state is now CONNECTED_GLOBAL Nov 26 01:52:09 localhost python3[6020]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-afea-b643-00000000012b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 01:52:19 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Nov 26 01:52:38 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully. Nov 26 01:52:53 localhost NetworkManager[5970]: [1764139973.7581] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:53 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Nov 26 01:52:53 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Nov 26 01:52:53 localhost NetworkManager[5970]: [1764139973.7823] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:53 localhost NetworkManager[5970]: [1764139973.7834] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume') Nov 26 01:52:53 localhost NetworkManager[5970]: [1764139973.7866] device (eth1): Activation: successful, device activated. Nov 26 01:52:53 localhost NetworkManager[5970]: [1764139973.7884] manager: startup complete Nov 26 01:52:53 localhost systemd[1]: Finished Network Manager Wait Online. Nov 26 01:53:03 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Nov 26 01:53:09 localhost systemd[1]: session-3.scope: Deactivated successfully. Nov 26 01:53:09 localhost systemd[1]: session-3.scope: Consumed 1.486s CPU time. Nov 26 01:53:09 localhost systemd-logind[761]: Session 3 logged out. Waiting for processes to exit. 
Nov 26 01:53:09 localhost systemd-logind[761]: Removed session 3. Nov 26 01:53:56 localhost sshd[6058]: main: sshd: ssh-rsa algorithm is disabled Nov 26 01:53:56 localhost systemd-logind[761]: New session 4 of user zuul. Nov 26 01:53:56 localhost systemd[1]: Started Session 4 of User zuul. Nov 26 01:53:56 localhost python3[6109]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 01:53:57 localhost python3[6152]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764140036.483479-628-72488607655869/source _original_basename=tmpx3mjn65f follow=False checksum=a50d969e2fa1c836ea228114d79e5fab1cb50176 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:54:00 localhost systemd[1]: session-4.scope: Deactivated successfully. Nov 26 01:54:00 localhost systemd-logind[761]: Session 4 logged out. Waiting for processes to exit. Nov 26 01:54:00 localhost systemd-logind[761]: Removed session 4. Nov 26 01:58:38 localhost sshd[6170]: main: sshd: ssh-rsa algorithm is disabled Nov 26 01:58:38 localhost systemd-logind[761]: New session 5 of user zuul. Nov 26 01:58:38 localhost systemd[1]: Started Session 5 of User zuul. 
Nov 26 01:58:39 localhost python3[6189]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-f047-84a6-000000001cf8-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 01:58:40 localhost python3[6208]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:58:40 localhost python3[6224]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:58:41 localhost python3[6240]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 01:58:41 localhost python3[6256]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:58:42 localhost python3[6272]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:58:43 localhost python3[6320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 01:58:43 localhost python3[6363]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764140323.1770217-632-183877538395947/source _original_basename=tmp2b82zy0d follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 01:58:45 localhost python3[6393]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 26 01:58:45 localhost systemd[1]: Reloading.
Nov 26 01:58:45 localhost systemd-rc-local-generator[6411]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 01:58:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 01:58:46 localhost python3[6440]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 26 01:58:48 localhost python3[6456]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 01:58:48 localhost python3[6474]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 01:58:48 localhost python3[6492]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 01:58:49 localhost python3[6510]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 01:58:50 localhost python3[6527]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-f047-84a6-000000001cff-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 01:58:50 localhost python3[6547]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 01:58:53 localhost systemd[1]: session-5.scope: Deactivated successfully.
Nov 26 01:58:53 localhost systemd[1]: session-5.scope: Consumed 4.072s CPU time.
Nov 26 01:58:53 localhost systemd-logind[761]: Session 5 logged out. Waiting for processes to exit.
Nov 26 01:58:53 localhost systemd-logind[761]: Removed session 5.
Nov 26 02:00:10 localhost sshd[6552]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:00:10 localhost systemd-logind[761]: New session 6 of user zuul.
Nov 26 02:00:10 localhost systemd[1]: Started Session 6 of User zuul.
Nov 26 02:00:11 localhost systemd[1]: Starting RHSM dbus service...
Nov 26 02:00:12 localhost systemd[1]: Started RHSM dbus service.
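The io.max writes logged above can be reproduced outside Ansible; the sketch below is illustrative only (the helper names and the slice list are assumptions, while the device number 252:0 and the limit values are taken from the logged commands). Applying the line for real requires root on a cgroup v2 host, so the write is left commented out.

```python
# Minimal sketch of the cgroup v2 io.max limit line seen in the log above.
# format_io_max/apply_io_max are hypothetical helper names, not from the log.
from pathlib import Path

def format_io_max(major: int, minor: int, riops: int, wiops: int,
                  rbps: int, wbps: int) -> str:
    """Build one io.max line: '<major>:<minor> riops=... wiops=... rbps=... wbps=...'."""
    return (f"{major}:{minor} riops={riops} wiops={wiops} "
            f"rbps={rbps} wbps={wbps}")

def apply_io_max(slice_name: str, line: str) -> None:
    """Write the limit line to the slice's io.max file (needs root, cgroup v2)."""
    Path(f"/sys/fs/cgroup/{slice_name}/io.max").write_text(line + "\n")

# Values copied from the logged echo commands:
line = format_io_max(252, 0, riops=18000, wiops=18000,
                     rbps=262144000, wbps=262144000)
# for slice_name in ("init.scope", "machine.slice", "system.slice", "user.slice"):
#     apply_io_max(slice_name, line)  # commented out: requires root
print(line)  # -> 252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000
```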
Nov 26 02:00:12 localhost rhsm-service[6576]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 26 02:00:12 localhost rhsm-service[6576]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 26 02:00:12 localhost rhsm-service[6576]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 26 02:00:12 localhost rhsm-service[6576]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 26 02:00:20 localhost rhsm-service[6576]: INFO [subscription_manager.managerlib:90] Consumer created: np0005536118.novalocal (c3ce536d-b475-4829-b391-421a6c252b3b)
Nov 26 02:00:20 localhost subscription-manager[6576]: Registered system with identity: c3ce536d-b475-4829-b391-421a6c252b3b
Nov 26 02:00:32 localhost rhsm-service[6576]: INFO [subscription_manager.entcertlib:131] certs updated:
Nov 26 02:00:32 localhost rhsm-service[6576]: Total updates: 1
Nov 26 02:00:32 localhost rhsm-service[6576]: Found (local) serial# []
Nov 26 02:00:32 localhost rhsm-service[6576]: Expected (UEP) serial# [6471731241583790646]
Nov 26 02:00:32 localhost rhsm-service[6576]: Added (new)
Nov 26 02:00:32 localhost rhsm-service[6576]: [sn:6471731241583790646 ( Content Access,) @ /etc/pki/entitlement/6471731241583790646.pem]
Nov 26 02:00:32 localhost rhsm-service[6576]: Deleted (rogue):
Nov 26 02:00:32 localhost rhsm-service[6576]:
Nov 26 02:00:32 localhost subscription-manager[6576]: Added subscription for 'Content Access' contract 'None'
Nov 26 02:00:32 localhost subscription-manager[6576]: Added subscription for product ' Content Access'
Nov 26 02:00:32 localhost systemd[1]: Starting Cleanup of Temporary Directories...
Nov 26 02:00:32 localhost systemd[1]: Starting dnf makecache...
Nov 26 02:00:32 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 26 02:00:32 localhost systemd[1]: Finished Cleanup of Temporary Directories.
Nov 26 02:00:32 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 26 02:00:33 localhost dnf[6639]: Updating Subscription Management repositories.
Nov 26 02:00:46 localhost rhsm-service[6576]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 26 02:00:46 localhost rhsm-service[6576]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 26 02:00:46 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:00:47 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:00:47 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:00:47 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:00:48 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:00:49 localhost dnf[6639]: Failed determining last makecache time.
Nov 26 02:00:50 localhost dnf[6639]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 51 MB/s | 44 MB 00:00
Nov 26 02:00:55 localhost dnf[6639]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 61 MB/s | 42 MB 00:00
Nov 26 02:00:55 localhost python3[6681]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163e3b-3c83-d35d-024d-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:00:57 localhost python3[6700]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:01:04 localhost dnf[6639]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 42 MB/s | 24 MB 00:00
Nov 26 02:01:09 localhost dnf[6639]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 26 MB/s | 14 MB 00:00
Nov 26 02:01:11 localhost dnf[6639]: Last metadata expiration check: 0:00:02 ago on Wed Nov 26 07:01:09 2025.
Nov 26 02:01:12 localhost dnf[6639]: Metadata cache created.
Nov 26 02:01:13 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 26 02:01:13 localhost systemd[1]: Finished dnf makecache.
Nov 26 02:01:13 localhost systemd[1]: dnf-makecache.service: Consumed 24.057s CPU time.
Nov 26 02:01:21 localhost setsebool[6797]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 26 02:01:21 localhost setsebool[6797]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 26 02:01:30 localhost kernel: SELinux: Converting 407 SID table entries...
Nov 26 02:01:30 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 26 02:01:30 localhost kernel: SELinux: policy capability open_perms=1
Nov 26 02:01:30 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 26 02:01:30 localhost kernel: SELinux: policy capability always_check_network=0
Nov 26 02:01:30 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 26 02:01:30 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 26 02:01:30 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 26 02:01:42 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=3 res=1
Nov 26 02:01:42 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:01:42 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 02:01:42 localhost systemd[1]: Reloading.
Nov 26 02:01:42 localhost systemd-rc-local-generator[7675]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:01:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:01:42 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 02:01:44 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:01:44 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:01:52 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 26 02:01:52 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 26 02:01:52 localhost systemd[1]: man-db-cache-update.service: Consumed 10.893s CPU time.
Nov 26 02:01:52 localhost systemd[1]: run-r81a378399cf74c258be3c1fc1a3c2b8a.service: Deactivated successfully.
Nov 26 02:02:44 localhost systemd[1]: session-6.scope: Deactivated successfully.
Nov 26 02:02:44 localhost systemd[1]: session-6.scope: Consumed 29.825s CPU time.
Nov 26 02:02:44 localhost systemd-logind[761]: Session 6 logged out. Waiting for processes to exit.
Nov 26 02:02:44 localhost systemd-logind[761]: Removed session 6.
Nov 26 02:02:44 localhost sshd[18374]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:02:44 localhost systemd-logind[761]: New session 7 of user zuul.
Nov 26 02:02:44 localhost systemd[1]: Started Session 7 of User zuul.
Nov 26 02:02:44 localhost podman[18394]: 2025-11-26 07:02:44.786504413 +0000 UTC m=+0.108836214 system refresh
Nov 26 02:02:45 localhost systemd[4183]: Starting D-Bus User Message Bus...
Nov 26 02:02:45 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:02:45 localhost dbus-broker-launch[18453]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 26 02:02:45 localhost dbus-broker-launch[18453]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 26 02:02:45 localhost systemd[4183]: Started D-Bus User Message Bus.
Nov 26 02:02:45 localhost journal[18453]: Ready
Nov 26 02:02:45 localhost systemd[4183]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1
Nov 26 02:02:45 localhost systemd[4183]: Created slice Slice /user.
Nov 26 02:02:45 localhost systemd[4183]: podman-18436.scope: unit configures an IP firewall, but not running as root.
Nov 26 02:02:45 localhost systemd[4183]: (This warning is only shown for the first unit using IP firewalling.)
Nov 26 02:02:45 localhost systemd[4183]: Started podman-18436.scope.
Nov 26 02:02:46 localhost systemd[4183]: Started podman-pause-fea178c7.scope.
Nov 26 02:02:48 localhost systemd[1]: session-7.scope: Deactivated successfully.
Nov 26 02:02:48 localhost systemd[1]: session-7.scope: Consumed 1.185s CPU time.
Nov 26 02:02:48 localhost systemd-logind[761]: Session 7 logged out. Waiting for processes to exit.
Nov 26 02:02:48 localhost systemd-logind[761]: Removed session 7.
Nov 26 02:03:05 localhost sshd[18456]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:03:05 localhost sshd[18458]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:03:05 localhost sshd[18460]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:03:05 localhost sshd[18459]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:03:05 localhost sshd[18457]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:03:10 localhost sshd[18466]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:03:10 localhost systemd-logind[761]: New session 8 of user zuul.
Nov 26 02:03:10 localhost systemd[1]: Started Session 8 of User zuul.
Nov 26 02:03:11 localhost python3[18483]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK7/bj+ZiWFDocRehbEoxL39rhCdUVeX3mjlGJhSU3i0U3FYlQ1ykBMbE6VkFwb1iGipDdiCuXJF97xMsFoOHkM= zuul@np0005536109.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 02:03:11 localhost python3[18499]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK7/bj+ZiWFDocRehbEoxL39rhCdUVeX3mjlGJhSU3i0U3FYlQ1ykBMbE6VkFwb1iGipDdiCuXJF97xMsFoOHkM= zuul@np0005536109.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 02:03:13 localhost systemd[1]: session-8.scope: Deactivated successfully.
Nov 26 02:03:13 localhost systemd-logind[761]: Session 8 logged out. Waiting for processes to exit.
Nov 26 02:03:13 localhost systemd-logind[761]: Removed session 8.
Nov 26 02:04:42 localhost sshd[18501]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:04:43 localhost systemd-logind[761]: New session 9 of user zuul.
Nov 26 02:04:43 localhost systemd[1]: Started Session 9 of User zuul.
Nov 26 02:04:43 localhost python3[18520]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkJddqZ+TwuLMCoD/CUKb6dnZ5nImZkr99k28vGFQTZD8B2L/Jx+KwKLctJwJdAbPZC/wCl/36ZPjbla3kwCBCcgDq4oMWypJH1O/63E9BgGHNHKyv8+W8cLdCN1zy1EpGO62uGHVn4l57+Bp2T37Fy3IKVmX+tQkDoTdmzgtr5i8E1khji5awitbNX6RCXkWRlMkvVByLh74T7HTnO21e4xp556VlHAFGjYIDNAjgNkyhO6M9ssBagiIOrBzbXvnmNyZxIeiznzLQGBwty3La7OiGgztNcwLCRTVHG+4hwiKk7RIRradK18HqKab9McNcGbbIU/uUQYbYTPIEWiEmDTYeyTBoy+veLsVUYfXRLJDerz6WvmIUiiLVU0ABmx7b9k9dwjYa9U8tscYuTfYVjocSnR3IVQDEikuw4Bklms2ijHLwfRS9oeb9XvpqyM10A4FQnSLPgHdrRpCWBm4+Nek0Esi3RXYub8PT5HuL5Q87j+qe66WazVu6iSRRGCM= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 02:04:44 localhost python3[18536]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005536118.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 26 02:04:47 localhost python3[18586]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:04:47 localhost python3[18629]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764140686.8878868-133-112501480375019/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=d32fdd92636a49f1ac6e190427f27813_id_rsa follow=False checksum=c5afe835443889fe9adecf4b2807aa7da7e61790 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:04:48 localhost python3[18691]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:04:49 localhost python3[18734]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764140688.6557455-224-205803618790646/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=d32fdd92636a49f1ac6e190427f27813_id_rsa.pub follow=False checksum=868249ad8a507d032dbe5de33122c4f3d30d20a1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:04:51 localhost python3[18764]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:04:52 localhost python3[18810]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:04:53 localhost python3[18826]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpi186c59c recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:04:54 localhost python3[18886]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:04:54 localhost python3[18902]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpc_49jb6k recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:04:55 localhost python3[18962]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:04:56 localhost python3[18978]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmp67mj3il_ recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:04:56 localhost systemd[1]: session-9.scope: Deactivated successfully.
Nov 26 02:04:56 localhost systemd[1]: session-9.scope: Consumed 4.052s CPU time.
Nov 26 02:04:56 localhost systemd-logind[761]: Session 9 logged out. Waiting for processes to exit.
Nov 26 02:04:56 localhost systemd-logind[761]: Removed session 9.
Nov 26 02:07:05 localhost sshd[18995]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:07:06 localhost systemd-logind[761]: New session 10 of user zuul.
Nov 26 02:07:06 localhost systemd[1]: Started Session 10 of User zuul.
Nov 26 02:07:06 localhost python3[19041]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:12:06 localhost systemd[1]: session-10.scope: Deactivated successfully.
Nov 26 02:12:06 localhost systemd-logind[761]: Session 10 logged out. Waiting for processes to exit.
Nov 26 02:12:06 localhost systemd-logind[761]: Removed session 10.
Nov 26 02:17:53 localhost sshd[19048]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:17:53 localhost systemd-logind[761]: New session 11 of user zuul.
Nov 26 02:17:53 localhost systemd[1]: Started Session 11 of User zuul.
Nov 26 02:17:53 localhost python3[19065]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163e3b-3c83-d4e6-36e5-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:17:55 localhost python3[19085]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163e3b-3c83-d4e6-36e5-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:18:00 localhost python3[19104]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Nov 26 02:18:03 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:18:58 localhost python3[19263]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Nov 26 02:19:01 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:19:01 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:19:09 localhost python3[19463]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Nov 26 02:19:12 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:19:12 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:19:18 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:19:18 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:19:41 localhost python3[19857]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 26 02:19:44 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:19:49 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:19:49 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:20:12 localhost python3[20133]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 26 02:20:15 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:20:20 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:20:20 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:20:47 localhost python3[20470]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d4e6-36e5-000000000013-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:20:52 localhost python3[20489]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:21:12 localhost kernel: SELinux: Converting 487 SID table entries...
Nov 26 02:21:12 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 26 02:21:12 localhost kernel: SELinux: policy capability open_perms=1
Nov 26 02:21:12 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 26 02:21:12 localhost kernel: SELinux: policy capability always_check_network=0
Nov 26 02:21:12 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 26 02:21:12 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 26 02:21:12 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 26 02:21:13 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=4 res=1
Nov 26 02:21:13 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 26 02:21:16 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:21:16 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 02:21:16 localhost systemd[1]: Reloading.
Nov 26 02:21:16 localhost systemd-rc-local-generator[21162]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:21:16 localhost systemd-sysv-generator[21165]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:21:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:21:16 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 02:21:17 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 26 02:21:17 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 26 02:21:17 localhost systemd[1]: run-r09fa9acc8ab1446a88e85ed113289dcf.service: Deactivated successfully.
Nov 26 02:21:18 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:21:18 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 26 02:21:46 localhost python3[21819]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d4e6-36e5-000000000015-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:22:18 localhost python3[21839]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:22:19 localhost python3[21887]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:22:19 localhost python3[21930]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764141738.6375675-292-128963151956593/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=9333f42ac4b9baf349a5c32f7bcba3335b5912e0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:22:20 localhost python3[21960]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 26 02:22:20 localhost systemd-journald[618]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Nov 26 02:22:20 localhost systemd-journald[618]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 26 02:22:20 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 26 02:22:21 localhost python3[21981]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 26 02:22:21 localhost python3[22001]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 26 02:22:21 localhost python3[22021]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 26 02:22:21 localhost python3[22041]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 26 02:22:24 localhost python3[22061]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 02:22:24 localhost systemd[1]: Starting LSB: Bring up/down
networking... Nov 26 02:22:24 localhost network[22064]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 02:22:24 localhost network[22075]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 02:22:24 localhost network[22064]: WARN : [network] 'network-scripts' will be removed from distribution in near future. Nov 26 02:22:24 localhost network[22076]: 'network-scripts' will be removed from distribution in near future. Nov 26 02:22:24 localhost network[22064]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management. Nov 26 02:22:24 localhost network[22077]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 02:22:24 localhost NetworkManager[5970]: [1764141744.4020] audit: op="connections-reload" pid=22105 uid=0 result="success" Nov 26 02:22:24 localhost network[22064]: Bringing up loopback interface: [ OK ] Nov 26 02:22:24 localhost NetworkManager[5970]: [1764141744.6076] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22193 uid=0 result="success" Nov 26 02:22:24 localhost network[22064]: Bringing up interface eth0: [ OK ] Nov 26 02:22:24 localhost systemd[1]: Started LSB: Bring up/down networking. Nov 26 02:22:25 localhost python3[22234]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 02:22:25 localhost systemd[1]: Starting Open vSwitch Database Unit... Nov 26 02:22:25 localhost chown[22238]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory Nov 26 02:22:25 localhost ovs-ctl[22243]: /etc/openvswitch/conf.db does not exist ... (warning). 
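The wrapped capture above interleaves many records per physical line; a minimal sketch of how such classic syslog-style entries can be split back into fields (timestamp, host, program, PID, message). The regex and field names are my assumptions, fitted to the line shapes seen in this log, not any standard library API:

```python
import re

# Field layout assumed from the entries in this log:
#   "Nov 26 02:22:24 localhost network[22064]: <message>"
SYSLOG_RE = re.compile(
    r"^(?P<ts>\w{3} +\d+ [\d:]{8}) "              # e.g. "Nov 26 02:22:24"
    r"(?P<host>\S+) "                              # e.g. "localhost"
    r"(?P<prog>[^\[:]+)(?:\[(?P<pid>\d+)\])?: "    # e.g. "network[22064]"
    r"(?P<msg>.*)$"
)

def parse_syslog_line(line: str) -> dict:
    """Parse one syslog-style line into a dict of named fields."""
    m = SYSLOG_RE.match(line)
    if m is None:
        raise ValueError(f"unrecognized syslog line: {line!r}")
    return m.groupdict()

if __name__ == "__main__":
    sample = ("Nov 26 02:22:24 localhost network[22064]: WARN : [network] "
              "You are using 'network' service provided by 'network-scripts', "
              "which are now deprecated.")
    rec = parse_syslog_line(sample)
    print(rec["prog"], rec["pid"])
```

Grouping the parsed records by `prog`/`pid` is one way to separate the interleaved `network[22064]` initscripts output from the duplicate unprefixed copies logged under other PIDs.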
Nov 26 02:22:25 localhost ovs-ctl[22243]: Creating empty database /etc/openvswitch/conf.db [ OK ]
Nov 26 02:22:25 localhost ovs-ctl[22243]: Starting ovsdb-server [ OK ]
Nov 26 02:22:25 localhost ovs-vsctl[22293]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 26 02:22:25 localhost ovs-vsctl[22313]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"8fad182b-d1fd-4eb1-a4d3-436a76a6f49e\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Nov 26 02:22:25 localhost ovs-ctl[22243]: Configuring Open vSwitch system IDs [ OK ]
Nov 26 02:22:25 localhost ovs-ctl[22243]: Enabling remote OVSDB managers [ OK ]
Nov 26 02:22:25 localhost ovs-vsctl[22319]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005536118.novalocal
Nov 26 02:22:25 localhost systemd[1]: Started Open vSwitch Database Unit.
Nov 26 02:22:25 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 26 02:22:25 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 26 02:22:25 localhost systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 26 02:22:25 localhost kernel: openvswitch: Open vSwitch switching datapath
Nov 26 02:22:25 localhost ovs-ctl[22364]: Inserting openvswitch module [ OK ]
Nov 26 02:22:25 localhost ovs-ctl[22332]: Starting ovs-vswitchd [ OK ]
Nov 26 02:22:25 localhost ovs-ctl[22332]: Enabling remote OVSDB managers [ OK ]
Nov 26 02:22:25 localhost ovs-vsctl[22383]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005536118.novalocal
Nov 26 02:22:25 localhost systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 26 02:22:25 localhost systemd[1]: Starting Open vSwitch...
Nov 26 02:22:25 localhost systemd[1]: Finished Open vSwitch.
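The `ovs-vsctl` records above are valuable because their `Called as` INFO lines preserve the exact CLI invocation. A small sketch that recovers the command from such a line; the pipe-delimited header shape (`ovs|NNNNN|vsctl|INFO|Called as ...`) is taken directly from the entries in this log:

```python
# Recover the exact ovs-vsctl invocation from a "Called as" INFO record.
def called_as(msg: str) -> str:
    """Strip the "ovs|00001|vsctl|INFO|" header and the "Called as " marker."""
    prefix = "Called as "
    body = msg.split("|", 4)[-1]  # keep everything after the 4th pipe
    if not body.startswith(prefix):
        raise ValueError("not a 'Called as' record")
    return body[len(prefix):]

if __name__ == "__main__":
    line = ("ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- "
            "set Open_vSwitch . db-version=8.5.1")
    print(called_as(line))
```

Replaying the recovered commands by hand (or comparing them against the intended os-net-config template) is one way to debug a bridge that came up misconfigured.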
Nov 26 02:22:28 localhost python3[22401]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d4e6-36e5-00000000001a-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:22:29 localhost NetworkManager[5970]: [1764141749.7461] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22559 uid=0 result="success"
Nov 26 02:22:29 localhost ifup[22560]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:29 localhost ifup[22561]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:29 localhost ifup[22562]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:29 localhost NetworkManager[5970]: [1764141749.7758] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22568 uid=0 result="success"
Nov 26 02:22:29 localhost ovs-vsctl[22570]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:70:c4:f6 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Nov 26 02:22:29 localhost kernel: device ovs-system entered promiscuous mode
Nov 26 02:22:29 localhost NetworkManager[5970]: [1764141749.7933] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Nov 26 02:22:29 localhost kernel: Timeout policy base is empty
Nov 26 02:22:29 localhost kernel: Failed to associated timeout policy `ovs_test_tp'
Nov 26 02:22:29 localhost systemd-udevd[22572]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 02:22:29 localhost kernel: device br-ex entered promiscuous mode
Nov 26 02:22:29 localhost systemd-udevd[22588]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 02:22:29 localhost NetworkManager[5970]: [1764141749.8191] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Nov 26 02:22:29 localhost NetworkManager[5970]: [1764141749.8397] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22598 uid=0 result="success"
Nov 26 02:22:29 localhost NetworkManager[5970]: [1764141749.8594] device (br-ex): carrier: link connected
Nov 26 02:22:32 localhost NetworkManager[5970]: [1764141752.9172] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22627 uid=0 result="success"
Nov 26 02:22:32 localhost NetworkManager[5970]: [1764141752.9596] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22642 uid=0 result="success"
Nov 26 02:22:33 localhost NET[22667]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.0395] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.0498] dhcp4 (eth1): canceled DHCP transaction
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.0499] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.0500] dhcp4 (eth1): state changed no lease
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.0555] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22676 uid=0 result="success"
Nov 26 02:22:33 localhost ifup[22677]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:33 localhost ifup[22678]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:33 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 26 02:22:33 localhost ifup[22680]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:33 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.0893] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22694 uid=0 result="success"
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.1593] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22704 uid=0 result="success"
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.1638] device (eth1): carrier: link connected
Nov 26 02:22:33 localhost NetworkManager[5970]: [1764141753.1817] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22713 uid=0 result="success"
Nov 26 02:22:33 localhost ipv6_wait_tentative[22725]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 26 02:22:34 localhost ipv6_wait_tentative[22730]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 26 02:22:35 localhost NetworkManager[5970]: [1764141755.2435] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22739 uid=0 result="success"
Nov 26 02:22:35 localhost ovs-vsctl[22754]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Nov 26 02:22:35 localhost kernel: device eth1 entered promiscuous mode
Nov 26 02:22:35 localhost NetworkManager[5970]: [1764141755.3137] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22762 uid=0 result="success"
Nov 26 02:22:35 localhost ifup[22763]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:35 localhost ifup[22764]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:35 localhost ifup[22765]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:35 localhost NetworkManager[5970]: [1764141755.3427] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22771 uid=0 result="success"
Nov 26 02:22:35 localhost NetworkManager[5970]: [1764141755.3822] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22781 uid=0 result="success"
Nov 26 02:22:35 localhost ifup[22782]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:35 localhost ifup[22783]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:35 localhost ifup[22784]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:35 localhost NetworkManager[5970]: [1764141755.4161] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22790 uid=0 result="success"
Nov 26 02:22:35 localhost ovs-vsctl[22793]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 26 02:22:35 localhost kernel: device vlan21 entered promiscuous mode
Nov 26 02:22:35 localhost NetworkManager[5970]: [1764141755.4563] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Nov 26 02:22:35 localhost systemd-udevd[22795]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 02:22:35 localhost NetworkManager[5970]: [1764141755.4847] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22804 uid=0 result="success"
Nov 26 02:22:35 localhost NetworkManager[5970]: [1764141755.5050] device (vlan21): carrier: link connected
Nov 26 02:22:38 localhost NetworkManager[5970]: [1764141758.5583] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22833 uid=0 result="success"
Nov 26 02:22:38 localhost NetworkManager[5970]: [1764141758.6066] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22848 uid=0 result="success"
Nov 26 02:22:38 localhost NetworkManager[5970]: [1764141758.6680] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22869 uid=0 result="success"
Nov 26 02:22:38 localhost ifup[22870]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:38 localhost ifup[22871]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:38 localhost ifup[22872]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:38 localhost NetworkManager[5970]: [1764141758.7012] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22878 uid=0 result="success"
Nov 26 02:22:38 localhost ovs-vsctl[22881]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 26 02:22:38 localhost kernel: device vlan20 entered promiscuous mode
Nov 26 02:22:38 localhost systemd-udevd[22883]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 02:22:38 localhost NetworkManager[5970]: [1764141758.7385] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Nov 26 02:22:38 localhost NetworkManager[5970]: [1764141758.7637] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22893 uid=0 result="success"
Nov 26 02:22:38 localhost NetworkManager[5970]: [1764141758.7839] device (vlan20): carrier: link connected
Nov 26 02:22:41 localhost NetworkManager[5970]: [1764141761.8324] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22923 uid=0 result="success"
Nov 26 02:22:41 localhost NetworkManager[5970]: [1764141761.8773] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22938 uid=0 result="success"
Nov 26 02:22:41 localhost NetworkManager[5970]: [1764141761.9378] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22959 uid=0 result="success"
Nov 26 02:22:41 localhost ifup[22960]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:41 localhost ifup[22961]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:41 localhost ifup[22962]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:41 localhost NetworkManager[5970]: [1764141761.9695] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22968 uid=0 result="success"
Nov 26 02:22:42 localhost ovs-vsctl[22971]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 26 02:22:42 localhost kernel: device vlan23 entered promiscuous mode
Nov 26 02:22:42 localhost systemd-udevd[22973]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 02:22:42 localhost NetworkManager[5970]: [1764141762.0522] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Nov 26 02:22:42 localhost NetworkManager[5970]: [1764141762.0793] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22983 uid=0 result="success"
Nov 26 02:22:42 localhost NetworkManager[5970]: [1764141762.0999] device (vlan23): carrier: link connected
Nov 26 02:22:43 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 26 02:22:45 localhost NetworkManager[5970]: [1764141765.1490] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23013 uid=0 result="success"
Nov 26 02:22:45 localhost NetworkManager[5970]: [1764141765.1963] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23028 uid=0 result="success"
Nov 26 02:22:45 localhost NetworkManager[5970]: [1764141765.2523] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23049 uid=0 result="success"
Nov 26 02:22:45 localhost ifup[23050]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:45 localhost ifup[23051]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:45 localhost ifup[23052]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:45 localhost NetworkManager[5970]: [1764141765.2856] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23058 uid=0 result="success"
Nov 26 02:22:45 localhost ovs-vsctl[23061]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 26 02:22:45 localhost kernel: device vlan44 entered promiscuous mode
Nov 26 02:22:45 localhost NetworkManager[5970]: [1764141765.3261] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Nov 26 02:22:45 localhost systemd-udevd[23063]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 02:22:45 localhost NetworkManager[5970]: [1764141765.3540] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23073 uid=0 result="success"
Nov 26 02:22:45 localhost NetworkManager[5970]: [1764141765.3716] device (vlan44): carrier: link connected
Nov 26 02:22:48 localhost NetworkManager[5970]: [1764141768.4296] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23103 uid=0 result="success"
Nov 26 02:22:48 localhost NetworkManager[5970]: [1764141768.4709] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23118 uid=0 result="success"
Nov 26 02:22:48 localhost NetworkManager[5970]: [1764141768.5279] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23139 uid=0 result="success"
Nov 26 02:22:48 localhost ifup[23140]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:48 localhost ifup[23141]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:48 localhost ifup[23142]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:48 localhost NetworkManager[5970]: [1764141768.5573] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23148 uid=0 result="success"
Nov 26 02:22:48 localhost ovs-vsctl[23151]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 26 02:22:48 localhost NetworkManager[5970]: [1764141768.6225] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Nov 26 02:22:48 localhost kernel: device vlan22 entered promiscuous mode
Nov 26 02:22:48 localhost systemd-udevd[23153]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 02:22:48 localhost NetworkManager[5970]: [1764141768.6463] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23163 uid=0 result="success"
Nov 26 02:22:48 localhost NetworkManager[5970]: [1764141768.6664] device (vlan22): carrier: link connected
Nov 26 02:22:51 localhost NetworkManager[5970]: [1764141771.7177] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23193 uid=0 result="success"
Nov 26 02:22:51 localhost NetworkManager[5970]: [1764141771.7678] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23208 uid=0 result="success"
Nov 26 02:22:51 localhost NetworkManager[5970]: [1764141771.8326] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23229 uid=0 result="success"
Nov 26 02:22:51 localhost ifup[23230]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:51 localhost ifup[23231]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:51 localhost ifup[23232]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
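The ovs-vsctl records in this stretch repeat the same idempotent pattern for each VLAN (`del-port` if it exists, then `add-port ... tag=NN ... type=internal`). A sketch that pulls the (port, tag) pairs back out of such lines, e.g. to verify every expected VLAN was attached to br-ex; the regex is an assumption fitted to the exact command shape seen in this log:

```python
import re

# Matches the "add-port <bridge> <port> tag=<NN>" fragment of the
# ovs-vsctl invocations recorded above.
ADD_PORT_RE = re.compile(r"add-port (?P<bridge>\S+) (?P<port>\S+) tag=(?P<tag>\d+)")

def vlan_ports(lines):
    """Return {port_name: vlan_tag} for every tagged add-port seen."""
    out = {}
    for line in lines:
        m = ADD_PORT_RE.search(line)
        if m:
            out[m.group("port")] = int(m.group("tag"))
    return out

if __name__ == "__main__":
    sample = [
        "Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- "
        "add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal",
        "Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- "
        "add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal",
    ]
    print(vlan_ports(sample))
```

Comparing the extracted mapping against the intended set (vlan20 through vlan23 plus vlan44 in this run) is a quick sanity check that the repeated ifup passes converged.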
Nov 26 02:22:51 localhost NetworkManager[5970]: [1764141771.8616] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23238 uid=0 result="success"
Nov 26 02:22:51 localhost ovs-vsctl[23241]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 26 02:22:51 localhost NetworkManager[5970]: [1764141771.9486] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23248 uid=0 result="success"
Nov 26 02:22:53 localhost NetworkManager[5970]: [1764141773.0037] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23275 uid=0 result="success"
Nov 26 02:22:53 localhost NetworkManager[5970]: [1764141773.0480] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23290 uid=0 result="success"
Nov 26 02:22:53 localhost NetworkManager[5970]: [1764141773.1009] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23311 uid=0 result="success"
Nov 26 02:22:53 localhost ifup[23312]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:53 localhost ifup[23313]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:53 localhost ifup[23314]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:53 localhost NetworkManager[5970]: [1764141773.1323] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23320 uid=0 result="success"
Nov 26 02:22:53 localhost ovs-vsctl[23323]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 26 02:22:53 localhost NetworkManager[5970]: [1764141773.2236] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23330 uid=0 result="success"
Nov 26 02:22:54 localhost NetworkManager[5970]: [1764141774.2864] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23358 uid=0 result="success"
Nov 26 02:22:54 localhost NetworkManager[5970]: [1764141774.3257] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23373 uid=0 result="success"
Nov 26 02:22:54 localhost NetworkManager[5970]: [1764141774.3838] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23394 uid=0 result="success"
Nov 26 02:22:54 localhost ifup[23395]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:54 localhost ifup[23396]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:54 localhost ifup[23397]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:54 localhost NetworkManager[5970]: [1764141774.4179] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23403 uid=0 result="success"
Nov 26 02:22:54 localhost ovs-vsctl[23406]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 26 02:22:54 localhost NetworkManager[5970]: [1764141774.5047] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23413 uid=0 result="success"
Nov 26 02:22:55 localhost NetworkManager[5970]: [1764141775.5668] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23441 uid=0 result="success"
Nov 26 02:22:55 localhost NetworkManager[5970]: [1764141775.6158] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23456 uid=0 result="success"
Nov 26 02:22:55 localhost NetworkManager[5970]: [1764141775.6795] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23477 uid=0 result="success"
Nov 26 02:22:55 localhost ifup[23478]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:55 localhost ifup[23479]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:55 localhost ifup[23480]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:55 localhost NetworkManager[5970]: [1764141775.7146] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23486 uid=0 result="success"
Nov 26 02:22:55 localhost ovs-vsctl[23489]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 26 02:22:55 localhost NetworkManager[5970]: [1764141775.8069] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23496 uid=0 result="success"
Nov 26 02:22:56 localhost NetworkManager[5970]: [1764141776.8670] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23524 uid=0 result="success"
Nov 26 02:22:56 localhost NetworkManager[5970]: [1764141776.9149] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23539 uid=0 result="success"
Nov 26 02:22:56 localhost NetworkManager[5970]: [1764141776.9778] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23560 uid=0 result="success"
Nov 26 02:22:56 localhost ifup[23561]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 26 02:22:56 localhost ifup[23562]: 'network-scripts' will be removed from distribution in near future.
Nov 26 02:22:56 localhost ifup[23563]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 26 02:22:57 localhost NetworkManager[5970]: [1764141777.0115] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23569 uid=0 result="success"
Nov 26 02:22:57 localhost ovs-vsctl[23572]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 26 02:22:57 localhost NetworkManager[5970]: [1764141777.1046] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23579 uid=0 result="success"
Nov 26 02:22:58 localhost NetworkManager[5970]: [1764141778.1710] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23607 uid=0 result="success"
Nov 26 02:22:58 localhost NetworkManager[5970]: [1764141778.2227] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23622 uid=0 result="success"
Nov 26 02:23:51 localhost python3[23654]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d4e6-36e5-00000000001b-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:23:56 localhost python3[23673]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkJddqZ+TwuLMCoD/CUKb6dnZ5nImZkr99k28vGFQTZD8B2L/Jx+KwKLctJwJdAbPZC/wCl/36ZPjbla3kwCBCcgDq4oMWypJH1O/63E9BgGHNHKyv8+W8cLdCN1zy1EpGO62uGHVn4l57+Bp2T37Fy3IKVmX+tQkDoTdmzgtr5i8E1khji5awitbNX6RCXkWRlMkvVByLh74T7HTnO21e4xp556VlHAFGjYIDNAjgNkyhO6M9ssBagiIOrBzbXvnmNyZxIeiznzLQGBwty3La7OiGgztNcwLCRTVHG+4hwiKk7RIRradK18HqKab9McNcGbbIU/uUQYbYTPIEWiEmDTYeyTBoy+veLsVUYfXRLJDerz6WvmIUiiLVU0ABmx7b9k9dwjYa9U8tscYuTfYVjocSnR3IVQDEikuw4Bklms2ijHLwfRS9oeb9XvpqyM10A4FQnSLPgHdrRpCWBm4+Nek0Esi3RXYub8PT5HuL5Q87j+qe66WazVu6iSRRGCM= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 02:23:57 localhost python3[23689]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkJddqZ+TwuLMCoD/CUKb6dnZ5nImZkr99k28vGFQTZD8B2L/Jx+KwKLctJwJdAbPZC/wCl/36ZPjbla3kwCBCcgDq4oMWypJH1O/63E9BgGHNHKyv8+W8cLdCN1zy1EpGO62uGHVn4l57+Bp2T37Fy3IKVmX+tQkDoTdmzgtr5i8E1khji5awitbNX6RCXkWRlMkvVByLh74T7HTnO21e4xp556VlHAFGjYIDNAjgNkyhO6M9ssBagiIOrBzbXvnmNyZxIeiznzLQGBwty3La7OiGgztNcwLCRTVHG+4hwiKk7RIRradK18HqKab9McNcGbbIU/uUQYbYTPIEWiEmDTYeyTBoy+veLsVUYfXRLJDerz6WvmIUiiLVU0ABmx7b9k9dwjYa9U8tscYuTfYVjocSnR3IVQDEikuw4Bklms2ijHLwfRS9oeb9XvpqyM10A4FQnSLPgHdrRpCWBm4+Nek0Esi3RXYub8PT5HuL5Q87j+qe66WazVu6iSRRGCM= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 02:23:58 localhost python3[23703]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkJddqZ+TwuLMCoD/CUKb6dnZ5nImZkr99k28vGFQTZD8B2L/Jx+KwKLctJwJdAbPZC/wCl/36ZPjbla3kwCBCcgDq4oMWypJH1O/63E9BgGHNHKyv8+W8cLdCN1zy1EpGO62uGHVn4l57+Bp2T37Fy3IKVmX+tQkDoTdmzgtr5i8E1khji5awitbNX6RCXkWRlMkvVByLh74T7HTnO21e4xp556VlHAFGjYIDNAjgNkyhO6M9ssBagiIOrBzbXvnmNyZxIeiznzLQGBwty3La7OiGgztNcwLCRTVHG+4hwiKk7RIRradK18HqKab9McNcGbbIU/uUQYbYTPIEWiEmDTYeyTBoy+veLsVUYfXRLJDerz6WvmIUiiLVU0ABmx7b9k9dwjYa9U8tscYuTfYVjocSnR3IVQDEikuw4Bklms2ijHLwfRS9oeb9XvpqyM10A4FQnSLPgHdrRpCWBm4+Nek0Esi3RXYub8PT5HuL5Q87j+qe66WazVu6iSRRGCM= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 02:23:59 localhost python3[23719]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkJddqZ+TwuLMCoD/CUKb6dnZ5nImZkr99k28vGFQTZD8B2L/Jx+KwKLctJwJdAbPZC/wCl/36ZPjbla3kwCBCcgDq4oMWypJH1O/63E9BgGHNHKyv8+W8cLdCN1zy1EpGO62uGHVn4l57+Bp2T37Fy3IKVmX+tQkDoTdmzgtr5i8E1khji5awitbNX6RCXkWRlMkvVByLh74T7HTnO21e4xp556VlHAFGjYIDNAjgNkyhO6M9ssBagiIOrBzbXvnmNyZxIeiznzLQGBwty3La7OiGgztNcwLCRTVHG+4hwiKk7RIRradK18HqKab9McNcGbbIU/uUQYbYTPIEWiEmDTYeyTBoy+veLsVUYfXRLJDerz6WvmIUiiLVU0ABmx7b9k9dwjYa9U8tscYuTfYVjocSnR3IVQDEikuw4Bklms2ijHLwfRS9oeb9XvpqyM10A4FQnSLPgHdrRpCWBm4+Nek0Esi3RXYub8PT5HuL5Q87j+qe66WazVu6iSRRGCM= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 26 02:24:00 localhost python3[23733]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Nov 26 02:24:00 localhost python3[23748]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005536118.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d4e6-36e5-000000000022-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:24:01 localhost python3[23768]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.localdomain"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d4e6-36e5-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:24:01 localhost systemd[1]: Starting Hostname Service...
Nov 26 02:24:01 localhost systemd[1]: Started Hostname Service.
Nov 26 02:24:01 localhost systemd-hostnamed[23772]: Hostname set to (static)
Nov 26 02:24:01 localhost NetworkManager[5970]: [1764141841.8144] hostname: static hostname changed from "np0005536118.novalocal" to "np0005536118.localdomain"
Nov 26 02:24:01 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 26 02:24:01 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 26 02:24:03 localhost systemd[1]: session-11.scope: Deactivated successfully.
Nov 26 02:24:03 localhost systemd[1]: session-11.scope: Consumed 1min 44.345s CPU time.
Nov 26 02:24:03 localhost systemd-logind[761]: Session 11 logged out. Waiting for processes to exit.
Nov 26 02:24:03 localhost systemd-logind[761]: Removed session 11.
Nov 26 02:24:05 localhost sshd[23783]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:24:05 localhost systemd-logind[761]: New session 12 of user zuul.
Nov 26 02:24:05 localhost systemd[1]: Started Session 12 of User zuul.
Nov 26 02:24:06 localhost python3[23800]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 26 02:24:08 localhost systemd[1]: session-12.scope: Deactivated successfully.
Nov 26 02:24:08 localhost systemd-logind[761]: Session 12 logged out. Waiting for processes to exit.
Nov 26 02:24:08 localhost systemd-logind[761]: Removed session 12.
Nov 26 02:24:11 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 26 02:24:31 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 26 02:24:53 localhost sshd[23804]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:24:53 localhost systemd-logind[761]: New session 13 of user zuul.
Nov 26 02:24:53 localhost systemd[1]: Started Session 13 of User zuul.
Nov 26 02:24:54 localhost python3[23823]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:24:57 localhost systemd[1]: Reloading.
Nov 26 02:24:57 localhost systemd-rc-local-generator[23865]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:24:57 localhost systemd-sysv-generator[23870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:24:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:24:57 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 26 02:24:58 localhost systemd[1]: Reloading.
Nov 26 02:24:58 localhost systemd-rc-local-generator[23913]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:24:58 localhost systemd-sysv-generator[23916]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:24:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:24:58 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 26 02:24:58 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 26 02:24:58 localhost systemd[1]: Reloading.
Nov 26 02:24:58 localhost systemd-sysv-generator[23951]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:24:58 localhost systemd-rc-local-generator[23944]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:24:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:24:58 localhost systemd[1]: Listening on LVM2 poll daemon socket.
Nov 26 02:24:58 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:24:58 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 02:24:58 localhost systemd[1]: Reloading.
Nov 26 02:24:59 localhost systemd-rc-local-generator[24003]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:24:59 localhost systemd-sysv-generator[24009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:24:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:24:59 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 02:24:59 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:24:59 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 26 02:24:59 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 26 02:24:59 localhost systemd[1]: run-r12f6e4c3f90d41f6bb0c527e8296704c.service: Deactivated successfully.
Nov 26 02:24:59 localhost systemd[1]: run-r98a6e6a8b3214f3dbb2ab59dbbe445a2.service: Deactivated successfully.
Nov 26 02:26:00 localhost systemd[1]: session-13.scope: Deactivated successfully.
Nov 26 02:26:00 localhost systemd[1]: session-13.scope: Consumed 4.592s CPU time.
Nov 26 02:26:00 localhost systemd-logind[761]: Session 13 logged out. Waiting for processes to exit.
Nov 26 02:26:00 localhost systemd-logind[761]: Removed session 13.
Nov 26 02:41:24 localhost sshd[24601]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:41:25 localhost systemd-logind[761]: New session 14 of user zuul.
Nov 26 02:41:25 localhost systemd[1]: Started Session 14 of User zuul.
Nov 26 02:41:25 localhost python3[24649]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 02:41:27 localhost python3[24736]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:41:30 localhost python3[24753]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:41:31 localhost python3[24770]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:41:31 localhost kernel: loop: module loaded
Nov 26 02:41:31 localhost kernel: loop3: detected capacity change from 0 to 14680064
Nov 26 02:41:31 localhost python3[24795]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:41:31 localhost lvm[24798]: PV /dev/loop3 not used.
Nov 26 02:41:31 localhost lvm[24800]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 26 02:41:31 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 26 02:41:31 localhost lvm[24803]: 1 logical volume(s) in volume group "ceph_vg0" now active
Nov 26 02:41:31 localhost lvm[24810]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 26 02:41:31 localhost lvm[24810]: VG ceph_vg0 finished
Nov 26 02:41:31 localhost systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 26 02:41:32 localhost python3[24858]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:41:32 localhost python3[24901]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764142892.1509688-55484-100503229656592/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:41:33 localhost python3[24931]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:41:33 localhost systemd[1]: Reloading.
Nov 26 02:41:33 localhost systemd-rc-local-generator[24961]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:41:33 localhost systemd-sysv-generator[24964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:41:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:41:33 localhost systemd[1]: Starting Ceph OSD losetup...
Nov 26 02:41:34 localhost bash[24973]: /dev/loop3: [64516]:8400144 (/var/lib/ceph-osd-0.img)
Nov 26 02:41:34 localhost systemd[1]: Finished Ceph OSD losetup.
Nov 26 02:41:34 localhost lvm[24974]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 26 02:41:34 localhost lvm[24974]: VG ceph_vg0 finished
Nov 26 02:41:34 localhost python3[24990]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:41:37 localhost python3[25007]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:41:38 localhost python3[25023]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:41:38 localhost kernel: loop4: detected capacity change from 0 to 14680064
Nov 26 02:41:38 localhost python3[25045]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:41:38 localhost lvm[25048]: PV /dev/loop4 not used.
Nov 26 02:41:38 localhost lvm[25050]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 26 02:41:38 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Nov 26 02:41:38 localhost lvm[25057]: 1 logical volume(s) in volume group "ceph_vg1" now active
Nov 26 02:41:38 localhost lvm[25061]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 26 02:41:38 localhost lvm[25061]: VG ceph_vg1 finished
Nov 26 02:41:38 localhost systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Nov 26 02:41:39 localhost python3[25109]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:41:39 localhost python3[25152]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764142899.2078562-55688-79131768870807/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:41:40 localhost python3[25182]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:41:40 localhost systemd[1]: Reloading.
Nov 26 02:41:40 localhost systemd-sysv-generator[25212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:41:40 localhost systemd-rc-local-generator[25209]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:41:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:41:40 localhost systemd[1]: Starting Ceph OSD losetup...
Nov 26 02:41:40 localhost bash[25223]: /dev/loop4: [64516]:8399529 (/var/lib/ceph-osd-1.img)
Nov 26 02:41:40 localhost systemd[1]: Finished Ceph OSD losetup.
Nov 26 02:41:40 localhost lvm[25224]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 26 02:41:40 localhost lvm[25224]: VG ceph_vg1 finished
Nov 26 02:41:49 localhost python3[25270]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 02:41:50 localhost python3[25290]: ansible-hostname Invoked with name=np0005536118.localdomain use=None
Nov 26 02:41:50 localhost systemd[1]: Starting Hostname Service...
Nov 26 02:41:50 localhost systemd[1]: Started Hostname Service.
Nov 26 02:41:52 localhost python3[25313]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 26 02:41:53 localhost python3[25361]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.hpfhz_8btmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:41:53 localhost python3[25391]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.hpfhz_8btmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:41:54 localhost python3[25407]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.hpfhz_8btmphosts insertbefore=BOF block=192.168.122.106 np0005536117.localdomain np0005536117#012192.168.122.106 np0005536117.ctlplane.localdomain np0005536117.ctlplane#012192.168.122.107 np0005536118.localdomain np0005536118#012192.168.122.107 np0005536118.ctlplane.localdomain np0005536118.ctlplane#012192.168.122.108 np0005536119.localdomain np0005536119#012192.168.122.108 np0005536119.ctlplane.localdomain np0005536119.ctlplane#012192.168.122.103 np0005536112.localdomain np0005536112#012192.168.122.103 np0005536112.ctlplane.localdomain np0005536112.ctlplane#012192.168.122.104 np0005536113.localdomain np0005536113#012192.168.122.104 np0005536113.ctlplane.localdomain np0005536113.ctlplane#012192.168.122.105 np0005536114.localdomain np0005536114#012192.168.122.105 np0005536114.ctlplane.localdomain np0005536114.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:41:54 localhost python3[25423]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.hpfhz_8btmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:41:55 localhost python3[25440]: ansible-file Invoked with path=/tmp/ansible.hpfhz_8btmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:41:57 localhost python3[25456]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:41:58 localhost python3[25474]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:42:02 localhost python3[25523]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:42:03 localhost python3[25568]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764142922.4014378-56540-256608093869815/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:42:04 localhost python3[25598]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:42:06 localhost python3[25616]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 02:42:06 localhost systemd[1]: Stopping NTP client/server...
Nov 26 02:42:06 localhost chronyd[767]: chronyd exiting
Nov 26 02:42:06 localhost systemd[1]: chronyd.service: Deactivated successfully.
Nov 26 02:42:06 localhost systemd[1]: Stopped NTP client/server.
Nov 26 02:42:06 localhost systemd[1]: chronyd.service: Consumed 108ms CPU time, read 1.9M from disk, written 0B to disk.
Nov 26 02:42:06 localhost systemd[1]: Starting NTP client/server...
Nov 26 02:42:06 localhost chronyd[25624]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 26 02:42:06 localhost chronyd[25624]: Frequency -30.598 +/- 0.189 ppm read from /var/lib/chrony/drift
Nov 26 02:42:06 localhost chronyd[25624]: Loaded seccomp filter (level 2)
Nov 26 02:42:06 localhost systemd[1]: Started NTP client/server.
Nov 26 02:42:07 localhost python3[25673]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:42:07 localhost python3[25716]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764142926.8722517-56754-2443155926298/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:42:08 localhost python3[25746]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:42:08 localhost systemd[1]: Reloading.
Nov 26 02:42:08 localhost systemd-rc-local-generator[25768]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:42:08 localhost systemd-sysv-generator[25773]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:42:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:42:08 localhost systemd[1]: Reloading.
Nov 26 02:42:08 localhost systemd-rc-local-generator[25811]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:42:08 localhost systemd-sysv-generator[25815]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:42:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:42:08 localhost systemd[1]: Starting chronyd online sources service...
Nov 26 02:42:08 localhost chronyc[25822]: 200 OK
Nov 26 02:42:08 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Nov 26 02:42:08 localhost systemd[1]: Finished chronyd online sources service.
Nov 26 02:42:09 localhost python3[25838]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:42:09 localhost chronyd[25624]: System clock was stepped by 0.000000 seconds
Nov 26 02:42:09 localhost python3[25855]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:42:10 localhost chronyd[25624]: Selected source 174.138.193.90 (pool.ntp.org)
Nov 26 02:42:20 localhost python3[25872]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 26 02:42:20 localhost systemd[1]: Starting Time & Date Service...
Nov 26 02:42:20 localhost systemd[1]: Started Time & Date Service.
Nov 26 02:42:20 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 26 02:42:21 localhost python3[25894]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 02:42:21 localhost systemd[1]: Stopping NTP client/server...
Nov 26 02:42:21 localhost chronyd[25624]: chronyd exiting
Nov 26 02:42:21 localhost systemd[1]: chronyd.service: Deactivated successfully.
Nov 26 02:42:21 localhost systemd[1]: Stopped NTP client/server.
Nov 26 02:42:21 localhost systemd[1]: Starting NTP client/server...
Nov 26 02:42:21 localhost chronyd[25901]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 26 02:42:21 localhost chronyd[25901]: Frequency -30.598 +/- 0.189 ppm read from /var/lib/chrony/drift
Nov 26 02:42:21 localhost chronyd[25901]: Loaded seccomp filter (level 2)
Nov 26 02:42:21 localhost systemd[1]: Started NTP client/server.
Nov 26 02:42:25 localhost chronyd[25901]: Selected source 23.133.168.244 (pool.ntp.org)
Nov 26 02:42:36 localhost sshd[25903]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:42:50 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 26 02:44:20 localhost sshd[26101]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:20 localhost systemd[1]: Created slice User Slice of UID 1002.
Nov 26 02:44:20 localhost systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 26 02:44:20 localhost systemd-logind[761]: New session 15 of user ceph-admin.
Nov 26 02:44:20 localhost systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 26 02:44:20 localhost systemd[1]: Starting User Manager for UID 1002...
Nov 26 02:44:20 localhost sshd[26118]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:20 localhost systemd[26105]: Queued start job for default target Main User Target.
Nov 26 02:44:20 localhost systemd[26105]: Created slice User Application Slice.
Nov 26 02:44:20 localhost systemd[26105]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 26 02:44:20 localhost systemd[26105]: Started Daily Cleanup of User's Temporary Directories.
Nov 26 02:44:20 localhost systemd[26105]: Reached target Paths.
Nov 26 02:44:20 localhost systemd[26105]: Reached target Timers.
Nov 26 02:44:20 localhost systemd[26105]: Starting D-Bus User Message Bus Socket...
Nov 26 02:44:20 localhost systemd[26105]: Starting Create User's Volatile Files and Directories...
Nov 26 02:44:20 localhost systemd[26105]: Finished Create User's Volatile Files and Directories.
Nov 26 02:44:20 localhost systemd[26105]: Listening on D-Bus User Message Bus Socket.
Nov 26 02:44:20 localhost systemd[26105]: Reached target Sockets.
Nov 26 02:44:20 localhost systemd[26105]: Reached target Basic System.
Nov 26 02:44:20 localhost systemd[26105]: Reached target Main User Target.
Nov 26 02:44:20 localhost systemd[26105]: Startup finished in 123ms.
Nov 26 02:44:20 localhost systemd[1]: Started User Manager for UID 1002.
Nov 26 02:44:20 localhost systemd[1]: Started Session 15 of User ceph-admin.
Nov 26 02:44:20 localhost systemd-logind[761]: New session 17 of user ceph-admin.
Nov 26 02:44:20 localhost systemd[1]: Started Session 17 of User ceph-admin.
Nov 26 02:44:21 localhost sshd[26140]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:21 localhost systemd-logind[761]: New session 18 of user ceph-admin.
Nov 26 02:44:21 localhost systemd[1]: Started Session 18 of User ceph-admin.
Nov 26 02:44:21 localhost sshd[26159]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:21 localhost systemd-logind[761]: New session 19 of user ceph-admin.
Nov 26 02:44:21 localhost systemd[1]: Started Session 19 of User ceph-admin.
Nov 26 02:44:21 localhost sshd[26178]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:21 localhost systemd-logind[761]: New session 20 of user ceph-admin.
Nov 26 02:44:21 localhost systemd[1]: Started Session 20 of User ceph-admin.
Nov 26 02:44:22 localhost sshd[26197]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:22 localhost systemd-logind[761]: New session 21 of user ceph-admin.
Nov 26 02:44:22 localhost systemd[1]: Started Session 21 of User ceph-admin.
Nov 26 02:44:22 localhost sshd[26216]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:22 localhost systemd-logind[761]: New session 22 of user ceph-admin.
Nov 26 02:44:22 localhost systemd[1]: Started Session 22 of User ceph-admin.
Nov 26 02:44:22 localhost sshd[26235]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:23 localhost systemd-logind[761]: New session 23 of user ceph-admin.
Nov 26 02:44:23 localhost systemd[1]: Started Session 23 of User ceph-admin.
Nov 26 02:44:23 localhost sshd[26254]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:23 localhost systemd-logind[761]: New session 24 of user ceph-admin.
Nov 26 02:44:23 localhost systemd[1]: Started Session 24 of User ceph-admin.
Nov 26 02:44:23 localhost sshd[26273]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:23 localhost systemd-logind[761]: New session 25 of user ceph-admin.
Nov 26 02:44:23 localhost systemd[1]: Started Session 25 of User ceph-admin.
Nov 26 02:44:24 localhost sshd[26290]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:24 localhost systemd-logind[761]: New session 26 of user ceph-admin.
Nov 26 02:44:24 localhost systemd[1]: Started Session 26 of User ceph-admin.
Nov 26 02:44:24 localhost sshd[26309]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:44:24 localhost systemd-logind[761]: New session 27 of user ceph-admin.
Nov 26 02:44:24 localhost systemd[1]: Started Session 27 of User ceph-admin.
Nov 26 02:44:25 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:44:43 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:44:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:44:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:44:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:44:44 localhost systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26522 (sysctl)
Nov 26 02:44:44 localhost systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 26 02:44:44 localhost systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 26 02:44:45 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:44:46 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:44:49 localhost kernel: VFS: idmapped mount is not enabled.
Nov 26 02:45:09 localhost podman[26662]:
Nov 26 02:45:09 localhost podman[26662]: 2025-11-26 07:45:09.654133367 +0000 UTC m=+23.172099093 container create fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_sutherland, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, version=7, release=553, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True)
Nov 26 02:45:09 localhost systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2315825385-merged.mount: Deactivated successfully.
Nov 26 02:45:09 localhost podman[26662]: 2025-11-26 07:44:46.524708436 +0000 UTC m=+0.042674232 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 02:45:09 localhost systemd[1]: Created slice Slice /machine.
Nov 26 02:45:09 localhost systemd[1]: Started libpod-conmon-fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b.scope.
Nov 26 02:45:09 localhost systemd[1]: Started libcrun container.
Nov 26 02:45:09 localhost podman[26662]: 2025-11-26 07:45:09.772560522 +0000 UTC m=+23.290526278 container init fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_sutherland, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, release=553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7)
Nov 26 02:45:09 localhost podman[26662]: 2025-11-26 07:45:09.789964181 +0000 UTC m=+23.307929927 container start fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_sutherland, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, ceph=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7)
Nov 26 02:45:09 localhost beautiful_sutherland[27035]: 167 167
Nov 26 02:45:09 localhost systemd[1]: libpod-fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b.scope: Deactivated successfully.
Nov 26 02:45:09 localhost podman[26662]: 2025-11-26 07:45:09.791837804 +0000 UTC m=+23.309803550 container attach fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_sutherland, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, build-date=2025-09-24T08:57:55)
Nov 26 02:45:09 localhost podman[26662]: 2025-11-26 07:45:09.795641183 +0000 UTC m=+23.313606969 container died fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_sutherland, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, name=rhceph)
Nov 26 02:45:09 localhost podman[27040]: 2025-11-26 07:45:09.891675409 +0000 UTC m=+0.089442830 container remove fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_sutherland, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, release=553, GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Nov 26 02:45:09 localhost systemd[1]: libpod-conmon-fe6a366202776c2f3dbb42da0af5735b83cc09856bd8c1cc890af9dce042012b.scope: Deactivated successfully.
Nov 26 02:45:10 localhost podman[27061]:
Nov 26 02:45:10 localhost podman[27061]: 2025-11-26 07:45:10.10942519 +0000 UTC m=+0.080681553 container create a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_williams, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 26 02:45:10 localhost systemd[1]: Started libpod-conmon-a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f.scope.
Nov 26 02:45:10 localhost systemd[1]: Started libcrun container.
Nov 26 02:45:10 localhost podman[27061]: 2025-11-26 07:45:10.077511696 +0000 UTC m=+0.048768089 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 02:45:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5814fdb4a68ed787cf16b79d6fde68f0ec818646b83eb9f6be6a236449f197fa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5814fdb4a68ed787cf16b79d6fde68f0ec818646b83eb9f6be6a236449f197fa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:10 localhost podman[27061]: 2025-11-26 07:45:10.195120712 +0000 UTC m=+0.166377055 container init a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_williams, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , distribution-scope=public, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, GIT_BRANCH=main, release=553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., version=7)
Nov 26 02:45:10 localhost podman[27061]: 2025-11-26 07:45:10.206398372 +0000 UTC m=+0.177654715 container start a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_williams, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, maintainer=Guillaume Abrioux )
Nov 26 02:45:10 localhost podman[27061]: 2025-11-26 07:45:10.206726541 +0000 UTC m=+0.177982904 container attach a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_williams, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux , release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.33.12)
Nov 26 02:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-ec87d440cfff7a451ad0a4c9505ffd47157b2abfc8250b04b46d0efbef1d3319-merged.mount: Deactivated successfully.
Nov 26 02:45:10 localhost xenodochial_williams[27077]: [
Nov 26 02:45:10 localhost xenodochial_williams[27077]: {
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "available": false,
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "ceph_device": false,
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "device_id": "QEMU_DVD-ROM_QM00001",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "lsm_data": {},
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "lvs": [],
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "path": "/dev/sr0",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "rejected_reasons": [
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "Has a FileSystem",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "Insufficient space (<5GB)"
Nov 26 02:45:10 localhost xenodochial_williams[27077]: ],
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "sys_api": {
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "actuators": null,
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "device_nodes": "sr0",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "human_readable_size": "482.00 KB",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "id_bus": "ata",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "model": "QEMU DVD-ROM",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "nr_requests": "2",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "partitions": {},
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "path": "/dev/sr0",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "removable": "1",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "rev": "2.5+",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "ro": "0",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "rotational": "1",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "sas_address": "",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "sas_device_handle": "",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "scheduler_mode": "mq-deadline",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "sectors": 0,
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "sectorsize": "2048",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "size": 493568.0,
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "support_discard": "0",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "type": "disk",
Nov 26 02:45:10 localhost xenodochial_williams[27077]: "vendor": "QEMU"
Nov 26 02:45:10 localhost xenodochial_williams[27077]: }
Nov 26 02:45:10 localhost xenodochial_williams[27077]: }
Nov 26 02:45:10 localhost xenodochial_williams[27077]: ]
Nov 26 02:45:10 localhost systemd[1]: libpod-a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f.scope: Deactivated successfully.
Nov 26 02:45:10 localhost podman[27061]: 2025-11-26 07:45:10.968302388 +0000 UTC m=+0.939558781 container died a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_williams, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, name=rhceph, version=7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 26 02:45:11 localhost systemd[1]: tmp-crun.DJ3AzF.mount: Deactivated successfully.
Nov 26 02:45:11 localhost systemd[1]: var-lib-containers-storage-overlay-5814fdb4a68ed787cf16b79d6fde68f0ec818646b83eb9f6be6a236449f197fa-merged.mount: Deactivated successfully.
Nov 26 02:45:11 localhost podman[28206]: 2025-11-26 07:45:11.068889829 +0000 UTC m=+0.087806222 container remove a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_williams, io.buildah.version=1.33.12, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, release=553, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7)
Nov 26 02:45:11 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:45:11 localhost systemd[1]: libpod-conmon-a0414c3e52ac6c67d455ed4b07b857c5ac45724393e03d2770b2e736c034695f.scope: Deactivated successfully.
Nov 26 02:45:11 localhost systemd[1]: systemd-coredump.socket: Deactivated successfully.
Nov 26 02:45:11 localhost systemd[1]: Closed Process Core Dump Socket.
Nov 26 02:45:11 localhost systemd[1]: Stopping Process Core Dump Socket...
Nov 26 02:45:11 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 26 02:45:11 localhost systemd[1]: Reloading.
Nov 26 02:45:11 localhost systemd-sysv-generator[28293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:45:11 localhost systemd-rc-local-generator[28289]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:45:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:45:12 localhost systemd[1]: Reloading.
Nov 26 02:45:12 localhost systemd-sysv-generator[28325]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:45:12 localhost systemd-rc-local-generator[28319]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:45:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:45:24 localhost sshd[28339]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:45:45 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:45:45 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:45:46 localhost podman[28670]:
Nov 26 02:45:46 localhost podman[28670]: 2025-11-26 07:45:46.091177073 +0000 UTC m=+0.067193565 container create 39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_benz, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, release=553, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, ceph=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 26 02:45:46 localhost systemd[1]: Started libpod-conmon-39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06.scope.
Nov 26 02:45:46 localhost systemd[1]: Started libcrun container.
Nov 26 02:45:46 localhost podman[28670]: 2025-11-26 07:45:46.163695018 +0000 UTC m=+0.139711540 container init 39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_benz, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc.)
Nov 26 02:45:46 localhost podman[28670]: 2025-11-26 07:45:46.068706113 +0000 UTC m=+0.044722625 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 02:45:46 localhost pedantic_benz[28686]: 167 167
Nov 26 02:45:46 localhost systemd[1]: libpod-39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06.scope: Deactivated successfully.
Nov 26 02:45:46 localhost podman[28670]: 2025-11-26 07:45:46.18054803 +0000 UTC m=+0.156564522 container start 39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_benz, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.openshift.tags=rhceph ceph, release=553, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, name=rhceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 02:45:46 localhost podman[28670]: 2025-11-26 07:45:46.180763445 +0000 UTC m=+0.156779957 container attach 39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_benz, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph)
Nov 26 02:45:46 localhost podman[28670]: 2025-11-26 07:45:46.184192099 +0000 UTC m=+0.160208601 container died 39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_benz, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12)
Nov 26 02:45:46 localhost podman[28691]: 2025-11-26 07:45:46.277965864 +0000 UTC m=+0.092905005 container remove 39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_benz, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12)
Nov 26 02:45:46 localhost systemd[1]: libpod-conmon-39427890f6526263b9860ed022579ce63ed5a6a2926c0dfcac32018d9cc95d06.scope: Deactivated successfully.
Nov 26 02:45:46 localhost systemd[1]: Reloading.
Nov 26 02:45:46 localhost systemd-sysv-generator[28735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:45:46 localhost systemd-rc-local-generator[28729]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:45:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:45:46 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:45:46 localhost systemd[1]: Reloading.
Nov 26 02:45:46 localhost systemd-rc-local-generator[28768]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:45:46 localhost systemd-sysv-generator[28775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:45:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:45:46 localhost systemd[1]: Reached target All Ceph clusters and services.
Nov 26 02:45:46 localhost systemd[1]: Reloading.
Nov 26 02:45:46 localhost systemd-rc-local-generator[28811]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:45:46 localhost systemd-sysv-generator[28814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:45:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:45:47 localhost systemd[1]: Reached target Ceph cluster 0d5e5e6d-3c4b-5efe-8c65-346ae6715606.
Nov 26 02:45:47 localhost systemd[1]: Reloading.
Nov 26 02:45:47 localhost systemd-sysv-generator[28853]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:45:47 localhost systemd-rc-local-generator[28848]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:45:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:45:47 localhost systemd[1]: Reloading.
Nov 26 02:45:47 localhost systemd-rc-local-generator[28886]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:45:47 localhost systemd-sysv-generator[28890]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:45:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:45:47 localhost systemd[1]: Created slice Slice /system/ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606.
Nov 26 02:45:47 localhost systemd[1]: Reached target System Time Set.
Nov 26 02:45:47 localhost systemd[1]: Reached target System Time Synchronized.
Nov 26 02:45:47 localhost systemd[1]: Starting Ceph crash.np0005536118 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606...
Nov 26 02:45:47 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:45:47 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:45:47 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 26 02:45:47 localhost podman[28951]:
Nov 26 02:45:48 localhost podman[28951]: 2025-11-26 07:45:48.001923226 +0000 UTC m=+0.085799600 container create a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, name=rhceph, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=553, architecture=x86_64)
Nov 26 02:45:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a9b57f3a8a8203430ea3197bab118bc929d6c1c74ff2a1082e9e236a98f717c/merged/etc/ceph/ceph.client.crash.np0005536118.keyring supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:48 localhost podman[28951]: 2025-11-26 07:45:47.968766185 +0000 UTC m=+0.052642559 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 02:45:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a9b57f3a8a8203430ea3197bab118bc929d6c1c74ff2a1082e9e236a98f717c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a9b57f3a8a8203430ea3197bab118bc929d6c1c74ff2a1082e9e236a98f717c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:48 localhost podman[28951]: 2025-11-26 07:45:48.101699948 +0000 UTC m=+0.185576342 container init a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, release=553, maintainer=Guillaume Abrioux , ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64)
Nov 26 02:45:48 localhost podman[28951]: 2025-11-26 07:45:48.113085027 +0000 UTC m=+0.196961401 container start a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.openshift.tags=rhceph ceph, release=553, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, name=rhceph, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.12, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 26 02:45:48 localhost bash[28951]: a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c
Nov 26 02:45:48 localhost systemd[1]: Started Ceph crash.np0005536118 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606.
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: 2025-11-26T07:45:48.298+0000 7f06c2321640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: 2025-11-26T07:45:48.298+0000 7f06c2321640 -1 AuthRegistry(0x7f06bc0680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: 2025-11-26T07:45:48.300+0000 7f06c2321640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: 2025-11-26T07:45:48.300+0000 7f06c2321640 -1 AuthRegistry(0x7f06c2320000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: 2025-11-26T07:45:48.309+0000 7f06c0897640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: 2025-11-26T07:45:48.310+0000 7f06bb7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: 2025-11-26T07:45:48.310+0000 7f06bbfff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: 2025-11-26T07:45:48.310+0000 7f06c2321640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 26 02:45:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118[28964]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 26 02:45:48 localhost podman[29049]:
Nov 26 02:45:48 localhost podman[29049]: 2025-11-26 07:45:48.969751678 +0000 UTC m=+0.059563538 container create d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_brattain, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, com.redhat.component=rhceph-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhceph, vcs-type=git, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_CLEAN=True)
Nov 26 02:45:49 localhost systemd[1]: Started libpod-conmon-d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb.scope.
Nov 26 02:45:49 localhost systemd[1]: Started libcrun container.
Nov 26 02:45:49 localhost podman[29049]: 2025-11-26 07:45:48.939842357 +0000 UTC m=+0.029654237 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 02:45:49 localhost podman[29049]: 2025-11-26 07:45:49.048737491 +0000 UTC m=+0.138549341 container init d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_brattain, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 26 02:45:49 localhost podman[29049]: 2025-11-26 07:45:49.059107534 +0000 UTC m=+0.148919384 container start d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_brattain, distribution-scope=public, version=7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, release=553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True)
Nov 26 02:45:49 localhost podman[29049]: 2025-11-26 07:45:49.05935002 +0000 UTC m=+0.149161880 container attach d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_brattain, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 26 02:45:49 localhost heuristic_brattain[29064]: 167 167
Nov 26 02:45:49 localhost systemd[1]: libpod-d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb.scope: Deactivated successfully.
Nov 26 02:45:49 localhost podman[29049]: 2025-11-26 07:45:49.063337088 +0000 UTC m=+0.153148938 container died d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_brattain, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, release=553)
Nov 26 02:45:49 localhost podman[29069]: 2025-11-26 07:45:49.160134036 +0000 UTC m=+0.081620698 container remove d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_brattain, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, version=7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph)
Nov 26 02:45:49 localhost systemd[1]: libpod-conmon-d01741366f960396cd34213c9373a17b4060e2613d814ffb254da91ca88196cb.scope: Deactivated successfully.
Nov 26 02:45:49 localhost podman[29088]:
Nov 26 02:45:49 localhost podman[29088]: 2025-11-26 07:45:49.358474519 +0000 UTC m=+0.071372396 container create ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_roentgen, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=553, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, version=7, maintainer=Guillaume Abrioux , vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 26 02:45:49 localhost systemd[1]: Started libpod-conmon-ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827.scope.
Nov 26 02:45:49 localhost systemd[1]: Started libcrun container.
Nov 26 02:45:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cfac173163a782678f0344b3918a1debd7316b76af7674801d530ccb3425f15/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cfac173163a782678f0344b3918a1debd7316b76af7674801d530ccb3425f15/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:49 localhost podman[29088]: 2025-11-26 07:45:49.33274294 +0000 UTC m=+0.045640867 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 02:45:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cfac173163a782678f0344b3918a1debd7316b76af7674801d530ccb3425f15/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cfac173163a782678f0344b3918a1debd7316b76af7674801d530ccb3425f15/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cfac173163a782678f0344b3918a1debd7316b76af7674801d530ccb3425f15/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 26 02:45:49 localhost podman[29088]: 2025-11-26 07:45:49.462493874 +0000 UTC m=+0.175391802 container init ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_roentgen, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, version=7, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 26 02:45:49 localhost podman[29088]: 2025-11-26 07:45:49.530078968 +0000 UTC m=+0.242976895 container start ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_roentgen, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, name=rhceph)
Nov 26 02:45:49 localhost podman[29088]: 2025-11-26 07:45:49.53051625 +0000 UTC m=+0.243414167 container attach ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_roentgen, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, distribution-scope=public, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., version=7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 26 02:45:49 localhost systemd[1]: var-lib-containers-storage-overlay-e970db1c5431f70befcdd81fada5b740288f8aaf69024c7e41831d7fc86b2270-merged.mount: Deactivated successfully.
Nov 26 02:45:49 localhost unruffled_roentgen[29103]: --> passed data devices: 0 physical, 2 LVM Nov 26 02:45:49 localhost unruffled_roentgen[29103]: --> relative data size: 1.0 Nov 26 02:45:50 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph-authtool --gen-print-key Nov 26 02:45:50 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 4c7370a1-c96a-417f-bde2-a93f51ef7561 Nov 26 02:45:50 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph-authtool --gen-print-key Nov 26 02:45:50 localhost lvm[29157]: PV /dev/loop3 online, VG ceph_vg0 is complete. Nov 26 02:45:50 localhost lvm[29157]: VG ceph_vg0 finished Nov 26 02:45:50 localhost unruffled_roentgen[29103]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0 Nov 26 02:45:50 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0 Nov 26 02:45:50 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Nov 26 02:45:50 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block Nov 26 02:45:50 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap Nov 26 02:45:51 localhost unruffled_roentgen[29103]: stderr: got monmap epoch 3 Nov 26 02:45:51 localhost unruffled_roentgen[29103]: --> Creating keyring file for osd.0 Nov 26 02:45:51 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring Nov 26 02:45:51 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/ Nov 26 02:45:51 localhost unruffled_roentgen[29103]: Running command: 
/usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 4c7370a1-c96a-417f-bde2-a93f51ef7561 --setuser ceph --setgroup ceph
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: stderr: 2025-11-26T07:45:51.218+0000 7ff88dd20a80 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: stderr: 2025-11-26T07:45:51.219+0000 7ff88dd20a80 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: --> ceph-volume lvm activate successful for osd ID: 0
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f9257f78-62ea-450a-a79b-9944ac21c834
Nov 26 02:45:54 localhost lvm[30086]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 26 02:45:54 localhost lvm[30086]: VG ceph_vg1 finished
Nov 26 02:45:54 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 26 02:45:54 localhost unruffled_roentgen[29103]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-4
Nov 26 02:45:54 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Nov 26 02:45:54 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 26 02:45:54 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Nov 26 02:45:54 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-4/activate.monmap
Nov 26 02:45:54 localhost unruffled_roentgen[29103]: stderr: got monmap epoch 3
Nov 26 02:45:54 localhost unruffled_roentgen[29103]: --> Creating keyring file for osd.4
Nov 26 02:45:55 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4/keyring
Nov 26 02:45:55 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4/
Nov 26 02:45:55 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 4 --monmap /var/lib/ceph/osd/ceph-4/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-4/ --osd-uuid f9257f78-62ea-450a-a79b-9944ac21c834 --setuser ceph --setgroup ceph
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: stderr: 2025-11-26T07:45:55.079+0000 7f32028a4a80 -1 bluestore(/var/lib/ceph/osd/ceph-4//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: stderr: 2025-11-26T07:45:55.079+0000 7f32028a4a80 -1 bluestore(/var/lib/ceph/osd/ceph-4/) _read_fsid unparsable uuid
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-4 --no-mon-config
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: --> ceph-volume lvm activate successful for osd ID: 4
Nov 26 02:45:57 localhost unruffled_roentgen[29103]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Nov 26 02:45:57 localhost systemd[1]: libpod-ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827.scope: Deactivated successfully.
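The `unruffled_roentgen` entries above interleave ceph-volume's "Running command:" lines with stderr and status output. When auditing such a run, it can help to pull out just the command sequence. A minimal illustrative sketch (not part of any Ceph tooling; it assumes one journal entry per line, as in the reflowed excerpt above):

```python
import re

# Journald lines from a cephadm/ceph-volume run look like:
#   Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
# This pattern captures everything after "Running command: " up to end of line.
CMD_RE = re.compile(r"Running command: (?P<cmd>/\S+(?: \S+)*)")

def extract_commands(log_text: str) -> list[str]:
    """Return the ceph-volume 'Running command:' entries, in order."""
    return [m.group("cmd") for m in CMD_RE.finditer(log_text)]

sample = (
    "Nov 26 02:45:53 localhost unruffled_roentgen[29103]: Running command: "
    "/usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block\n"
    "Nov 26 02:45:53 localhost unruffled_roentgen[29103]: --> ceph-volume lvm "
    "activate successful for osd ID: 0\n"
)
print(extract_commands(sample))
```

Because the capture runs to end of line, entries that were fused onto one physical line would be swallowed together; split the log into one entry per line first.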
Nov 26 02:45:57 localhost systemd[1]: libpod-ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827.scope: Consumed 3.734s CPU time. Nov 26 02:45:57 localhost podman[29088]: 2025-11-26 07:45:57.695603319 +0000 UTC m=+8.408501276 container died ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_roentgen, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, release=553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7) Nov 26 02:45:57 localhost systemd[1]: var-lib-containers-storage-overlay-7cfac173163a782678f0344b3918a1debd7316b76af7674801d530ccb3425f15-merged.mount: Deactivated successfully. 
Nov 26 02:45:57 localhost podman[30986]: 2025-11-26 07:45:57.777357269 +0000 UTC m=+0.068415775 container remove ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_roentgen, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, release=553, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 02:45:57 localhost systemd[1]: libpod-conmon-ad237c73862c229295f8ca277ab8004e2c0e3279f3298e4b35de3939655e9827.scope: Deactivated successfully. 
Nov 26 02:45:58 localhost podman[31069]: Nov 26 02:45:58 localhost podman[31069]: 2025-11-26 07:45:58.524105001 +0000 UTC m=+0.049638926 container create 51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_kepler, architecture=x86_64, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public) Nov 26 02:45:58 localhost systemd[1]: Started libpod-conmon-51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141.scope. Nov 26 02:45:58 localhost systemd[1]: Started libcrun container. 
Nov 26 02:45:58 localhost podman[31069]: 2025-11-26 07:45:58.576059853 +0000 UTC m=+0.101593788 container init 51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_kepler, version=7, GIT_BRANCH=main, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph) Nov 26 02:45:58 localhost podman[31069]: 2025-11-26 07:45:58.585295769 +0000 UTC m=+0.110829724 container start 51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_kepler, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , 
architecture=x86_64, version=7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=553, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, io.buildah.version=1.33.12) Nov 26 02:45:58 localhost podman[31069]: 2025-11-26 07:45:58.585575866 +0000 UTC m=+0.111109821 container attach 51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_kepler, release=553, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, version=7, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph) Nov 26 02:45:58 localhost zen_kepler[31084]: 167 167 Nov 26 02:45:58 localhost systemd[1]: libpod-51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141.scope: Deactivated 
successfully. Nov 26 02:45:58 localhost podman[31069]: 2025-11-26 07:45:58.588945298 +0000 UTC m=+0.114479293 container died 51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_kepler, release=553, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.component=rhceph-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 02:45:58 localhost podman[31069]: 2025-11-26 07:45:58.506705516 +0000 UTC m=+0.032239451 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:45:58 localhost podman[31089]: 2025-11-26 07:45:58.670287588 +0000 UTC m=+0.070658400 container remove 51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_kepler, build-date=2025-09-24T08:57:55, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 02:45:58 localhost systemd[1]: libpod-conmon-51bf3f93ffb7884041f71f9bc27b0479ee40a8bfc7f99854a94b9236b311e141.scope: Deactivated successfully. Nov 26 02:45:58 localhost systemd[1]: var-lib-containers-storage-overlay-a1c01006d62a79560d50434f3373f2a221ebf1cc1ad7d31a92f44646337f9071-merged.mount: Deactivated successfully. 
Nov 26 02:45:58 localhost podman[31110]: Nov 26 02:45:58 localhost podman[31110]: 2025-11-26 07:45:58.856479245 +0000 UTC m=+0.070738183 container create c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_khorana, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, description=Red Hat Ceph Storage 7, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main) Nov 26 02:45:58 localhost podman[31110]: 2025-11-26 07:45:58.828449469 +0000 UTC m=+0.042708427 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:45:58 localhost systemd[1]: Started libpod-conmon-c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b.scope. Nov 26 02:45:58 localhost systemd[1]: Started libcrun container. 
Nov 26 02:45:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2dac88a75020b529bf40ccdc75b50c703d7a70f91e7663acf75641c5863e816/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:45:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2dac88a75020b529bf40ccdc75b50c703d7a70f91e7663acf75641c5863e816/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 02:45:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2dac88a75020b529bf40ccdc75b50c703d7a70f91e7663acf75641c5863e816/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 02:45:58 localhost podman[31110]: 2025-11-26 07:45:58.995938156 +0000 UTC m=+0.210197094 container init c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_khorana, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, version=7, distribution-scope=public) Nov 26 02:45:59 localhost 
systemd[1]: tmp-crun.bomVJy.mount: Deactivated successfully. Nov 26 02:45:59 localhost podman[31110]: 2025-11-26 07:45:59.012111342 +0000 UTC m=+0.226370310 container start c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_khorana, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, ceph=True, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64) Nov 26 02:45:59 localhost podman[31110]: 2025-11-26 07:45:59.012729397 +0000 UTC m=+0.226988345 container attach c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_khorana, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red 
Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=553, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-type=git, GIT_CLEAN=True)
Nov 26 02:45:59 localhost beautiful_khorana[31125]: {
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "0": [
Nov 26 02:45:59 localhost beautiful_khorana[31125]: {
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "devices": [
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "/dev/loop3"
Nov 26 02:45:59 localhost beautiful_khorana[31125]: ],
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_name": "ceph_lv0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_size": "7511998464",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=j8qjSE-TMgp-tExn-Q2wI-qCTT-EQAN-ldlvoR,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=0d5e5e6d-3c4b-5efe-8c65-346ae6715606,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=4c7370a1-c96a-417f-bde2-a93f51ef7561,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_uuid": "j8qjSE-TMgp-tExn-Q2wI-qCTT-EQAN-ldlvoR",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "name": "ceph_lv0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "path": "/dev/ceph_vg0/ceph_lv0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "tags": {
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.block_uuid": "j8qjSE-TMgp-tExn-Q2wI-qCTT-EQAN-ldlvoR",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.cephx_lockbox_secret": "",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.cluster_fsid": "0d5e5e6d-3c4b-5efe-8c65-346ae6715606",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.cluster_name": "ceph",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.crush_device_class": "",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.encrypted": "0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.osd_fsid": "4c7370a1-c96a-417f-bde2-a93f51ef7561",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.osd_id": "0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.osdspec_affinity": "default_drive_group",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.type": "block",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.vdo": "0"
Nov 26 02:45:59 localhost beautiful_khorana[31125]: },
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "type": "block",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "vg_name": "ceph_vg0"
Nov 26 02:45:59 localhost beautiful_khorana[31125]: }
Nov 26 02:45:59 localhost beautiful_khorana[31125]: ],
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "4": [
Nov 26 02:45:59 localhost beautiful_khorana[31125]: {
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "devices": [
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "/dev/loop4"
Nov 26 02:45:59 localhost beautiful_khorana[31125]: ],
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_name": "ceph_lv1",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_path": "/dev/ceph_vg1/ceph_lv1",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_size": "7511998464",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=TXBq16-kmKx-W9DV-iTEb-kvPN-o4eV-StAoAL,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=0d5e5e6d-3c4b-5efe-8c65-346ae6715606,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f9257f78-62ea-450a-a79b-9944ac21c834,ceph.osd_id=4,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "lv_uuid": "TXBq16-kmKx-W9DV-iTEb-kvPN-o4eV-StAoAL",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "name": "ceph_lv1",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "path": "/dev/ceph_vg1/ceph_lv1",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "tags": {
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.block_uuid": "TXBq16-kmKx-W9DV-iTEb-kvPN-o4eV-StAoAL",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.cephx_lockbox_secret": "",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.cluster_fsid": "0d5e5e6d-3c4b-5efe-8c65-346ae6715606",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.cluster_name": "ceph",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.crush_device_class": "",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.encrypted": "0",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.osd_fsid": "f9257f78-62ea-450a-a79b-9944ac21c834",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.osd_id": "4",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.osdspec_affinity": "default_drive_group",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.type": "block",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "ceph.vdo": "0"
Nov 26 02:45:59 localhost beautiful_khorana[31125]: },
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "type": "block",
Nov 26 02:45:59 localhost beautiful_khorana[31125]: "vg_name": "ceph_vg1"
Nov 26 02:45:59 localhost beautiful_khorana[31125]: }
Nov 26 02:45:59 localhost beautiful_khorana[31125]: ]
Nov 26 02:45:59 localhost beautiful_khorana[31125]: }
Nov 26 02:45:59 localhost systemd[1]: libpod-c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b.scope: Deactivated successfully.
Nov 26 02:45:59 localhost podman[31110]: 2025-11-26 07:45:59.362478155 +0000 UTC m=+0.576737153 container died c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_khorana, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, release=553, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.buildah.version=1.33.12, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 02:45:59 localhost podman[31134]: 2025-11-26 07:45:59.454036506 +0000 UTC m=+0.079772654 container remove c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_khorana, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553,
version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 02:45:59 localhost systemd[1]: libpod-conmon-c405db1432c0d7cc4af9baad00cacd6e91acac2b49d1bb7eb205777cbf98665b.scope: Deactivated successfully. Nov 26 02:45:59 localhost systemd[1]: var-lib-containers-storage-overlay-a2dac88a75020b529bf40ccdc75b50c703d7a70f91e7663acf75641c5863e816-merged.mount: Deactivated successfully. 
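The `lv_tags` strings in the `ceph-volume lvm list` output above are how ceph-volume persists OSD metadata in LVM tags: a comma-separated list of `key=value` pairs (none of the values shown contain commas, and some, like `ceph.cephx_lockbox_secret`, are empty). A minimal decoding sketch, illustrative only and not a ceph-volume API:

```python
def parse_lv_tags(lv_tags: str) -> dict[str, str]:
    """Decode the comma-separated key=value string ceph-volume stores in LVM tags.

    Naive split: assumes no value contains a comma, which holds for the
    tags shown in this log.
    """
    tags = {}
    for item in lv_tags.split(","):
        key, _, value = item.partition("=")  # empty values become ""
        tags[key] = value
    return tags

# Abbreviated lv_tags value copied from the 'ceph-volume lvm list' output above
lv_tags = (
    "ceph.block_device=/dev/ceph_vg0/ceph_lv0,"
    "ceph.block_uuid=j8qjSE-TMgp-tExn-Q2wI-qCTT-EQAN-ldlvoR,"
    "ceph.cluster_name=ceph,"
    "ceph.osd_fsid=4c7370a1-c96a-417f-bde2-a93f51ef7561,"
    "ceph.osd_id=0,ceph.type=block"
)
tags = parse_lv_tags(lv_tags)
print(tags["ceph.osd_id"], tags["ceph.type"])
```

The same pairs appear pre-parsed under the `"tags"` key of the JSON output, so in practice reading that object directly is simpler; the sketch is only useful when all you have is the raw tag string (e.g. from `lvs -o lv_tags`).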
Nov 26 02:46:00 localhost podman[31220]: Nov 26 02:46:00 localhost podman[31220]: 2025-11-26 07:46:00.215903938 +0000 UTC m=+0.061646850 container create 42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_ellis, name=rhceph, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, architecture=x86_64, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 02:46:00 localhost systemd[1]: Started libpod-conmon-42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb.scope. Nov 26 02:46:00 localhost systemd[1]: Started libcrun container. 
Nov 26 02:46:00 localhost podman[31220]: 2025-11-26 07:46:00.266537836 +0000 UTC m=+0.112280738 container init 42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_ellis, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public) Nov 26 02:46:00 localhost podman[31220]: 2025-11-26 07:46:00.276198232 +0000 UTC m=+0.121941174 container start 42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_ellis, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, release=553, ceph=True, 
io.buildah.version=1.33.12, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 02:46:00 localhost podman[31220]: 2025-11-26 07:46:00.276513821 +0000 UTC m=+0.122256723 container attach 42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_ellis, description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main) Nov 26 02:46:00 localhost xenodochial_ellis[31236]: 167 167 Nov 26 02:46:00 localhost systemd[1]: 
libpod-42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb.scope: Deactivated successfully. Nov 26 02:46:00 localhost podman[31220]: 2025-11-26 07:46:00.277989177 +0000 UTC m=+0.123732119 container died 42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_ellis, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, maintainer=Guillaume Abrioux , name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 02:46:00 localhost podman[31220]: 2025-11-26 07:46:00.181774173 +0000 UTC m=+0.027517115 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:00 localhost podman[31241]: 2025-11-26 07:46:00.346051882 +0000 UTC m=+0.057855896 container remove 42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_ellis, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, description=Red Hat Ceph Storage 7, 
io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_BRANCH=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, maintainer=Guillaume Abrioux , version=7, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 02:46:00 localhost systemd[1]: libpod-conmon-42b1c584291d971e5eccdc05ebca81d8c5285513c19aa38fa7fdf3021a996bdb.scope: Deactivated successfully. Nov 26 02:46:00 localhost podman[31270]: Nov 26 02:46:00 localhost podman[31270]: 2025-11-26 07:46:00.65860209 +0000 UTC m=+0.071160812 container create 67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., 
release=553, io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 02:46:00 localhost systemd[1]: Started libpod-conmon-67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5.scope. Nov 26 02:46:00 localhost systemd[1]: Started libcrun container. Nov 26 02:46:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fa859434027e1306f96297ca4b1816d038eb97078b0d05fe41a8cdd6e2897a/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:00 localhost podman[31270]: 2025-11-26 07:46:00.634304815 +0000 UTC m=+0.046863557 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fa859434027e1306f96297ca4b1816d038eb97078b0d05fe41a8cdd6e2897a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fa859434027e1306f96297ca4b1816d038eb97078b0d05fe41a8cdd6e2897a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:00 localhost systemd[1]: var-lib-containers-storage-overlay-f2ea82480e5e65a9b4619103f100fd439fa70f553b18b449b676cb4b9c5978b5-merged.mount: Deactivated successfully. 
Nov 26 02:46:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fa859434027e1306f96297ca4b1816d038eb97078b0d05fe41a8cdd6e2897a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85fa859434027e1306f96297ca4b1816d038eb97078b0d05fe41a8cdd6e2897a/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:00 localhost podman[31270]: 2025-11-26 07:46:00.78900817 +0000 UTC m=+0.201566912 container init 67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, release=553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 02:46:00 localhost podman[31270]: 2025-11-26 07:46:00.799883096 +0000 UTC m=+0.212441868 container start 67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test, release=553, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=7, io.buildah.version=1.33.12, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, io.openshift.tags=rhceph ceph) Nov 26 02:46:00 localhost podman[31270]: 2025-11-26 07:46:00.800241835 +0000 UTC m=+0.212800617 container attach 67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on 
RHEL 9, release=553, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, architecture=x86_64, version=7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 02:46:01 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test[31285]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Nov 26 02:46:01 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test[31285]: [--no-systemd] [--no-tmpfs] Nov 26 02:46:01 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test[31285]: ceph-volume activate: error: unrecognized arguments: --bad-option Nov 26 02:46:01 localhost systemd[1]: libpod-67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5.scope: Deactivated successfully. 
Nov 26 02:46:01 localhost podman[31270]: 2025-11-26 07:46:01.023918648 +0000 UTC m=+0.436477460 container died 67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, release=553, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 02:46:01 localhost systemd[1]: var-lib-containers-storage-overlay-85fa859434027e1306f96297ca4b1816d038eb97078b0d05fe41a8cdd6e2897a-merged.mount: Deactivated successfully. Nov 26 02:46:01 localhost systemd-journald[618]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Nov 26 02:46:01 localhost systemd-journald[618]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 26 02:46:01 localhost rsyslogd[760]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 02:46:01 localhost podman[31290]: 2025-11-26 07:46:01.130057075 +0000 UTC m=+0.092113375 container remove 67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate-test, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.component=rhceph-container, release=553, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7) Nov 26 02:46:01 localhost systemd[1]: libpod-conmon-67485fb4f51de7108665c8cb5f82316f35496e474816c5b01bc5dbd88490f9b5.scope: Deactivated successfully. Nov 26 02:46:01 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 02:46:01 localhost systemd[1]: Reloading. Nov 26 02:46:01 localhost systemd-rc-local-generator[31345]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:46:01 localhost systemd-sysv-generator[31350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 02:46:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:46:01 localhost systemd[1]: Reloading. Nov 26 02:46:01 localhost systemd-sysv-generator[31388]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 02:46:01 localhost systemd-rc-local-generator[31385]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:46:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:46:01 localhost systemd[1]: Starting Ceph osd.0 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606... 
Nov 26 02:46:02 localhost podman[31448]: Nov 26 02:46:02 localhost podman[31448]: 2025-11-26 07:46:02.302404632 +0000 UTC m=+0.075315534 container create f0fef1f461765ae048eeccf9f8b20278bcd08218d3b9b268b814b0bec648cfe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, version=7, name=rhceph, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 02:46:02 localhost systemd[1]: tmp-crun.jFzJWJ.mount: Deactivated successfully. Nov 26 02:46:02 localhost systemd[1]: Started libcrun container. 
Nov 26 02:46:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d096698d69a0992542b250ed5a5cc87c22d624a6431119d43962f50c6b51f930/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:02 localhost podman[31448]: 2025-11-26 07:46:02.271870154 +0000 UTC m=+0.044781066 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d096698d69a0992542b250ed5a5cc87c22d624a6431119d43962f50c6b51f930/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d096698d69a0992542b250ed5a5cc87c22d624a6431119d43962f50c6b51f930/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d096698d69a0992542b250ed5a5cc87c22d624a6431119d43962f50c6b51f930/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d096698d69a0992542b250ed5a5cc87c22d624a6431119d43962f50c6b51f930/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:02 localhost podman[31448]: 2025-11-26 07:46:02.415045008 +0000 UTC m=+0.187955920 container init f0fef1f461765ae048eeccf9f8b20278bcd08218d3b9b268b814b0bec648cfe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, 
com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55) Nov 26 02:46:02 localhost podman[31448]: 2025-11-26 07:46:02.425238137 +0000 UTC m=+0.198149039 container start f0fef1f461765ae048eeccf9f8b20278bcd08218d3b9b268b814b0bec648cfe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate, architecture=x86_64, name=rhceph, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7) Nov 26 02:46:02 localhost 
podman[31448]: 2025-11-26 07:46:02.425524774 +0000 UTC m=+0.198435676 container attach f0fef1f461765ae048eeccf9f8b20278bcd08218d3b9b268b814b0bec648cfe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 02:46:03 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate[31463]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 26 02:46:03 localhost bash[31448]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 26 02:46:03 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate[31463]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0 Nov 26 02:46:03 localhost bash[31448]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0 Nov 26 02:46:03 
localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate[31463]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0 Nov 26 02:46:03 localhost bash[31448]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0 Nov 26 02:46:03 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate[31463]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Nov 26 02:46:03 localhost bash[31448]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Nov 26 02:46:03 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate[31463]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:03 localhost bash[31448]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:03 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate[31463]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 26 02:46:03 localhost bash[31448]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 26 02:46:03 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate[31463]: --> ceph-volume raw activate successful for osd ID: 0 Nov 26 02:46:03 localhost bash[31448]: --> ceph-volume raw activate successful for osd ID: 0 Nov 26 02:46:03 localhost systemd[1]: libpod-f0fef1f461765ae048eeccf9f8b20278bcd08218d3b9b268b814b0bec648cfe0.scope: Deactivated successfully. 
Nov 26 02:46:03 localhost podman[31594]: 2025-11-26 07:46:03.237752009 +0000 UTC m=+0.052100467 container died f0fef1f461765ae048eeccf9f8b20278bcd08218d3b9b268b814b0bec648cfe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_CLEAN=True, release=553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, ceph=True, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 02:46:03 localhost podman[31594]: 2025-11-26 07:46:03.270336925 +0000 UTC m=+0.084685363 container remove f0fef1f461765ae048eeccf9f8b20278bcd08218d3b9b268b814b0bec648cfe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0-activate, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, release=553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.buildah.version=1.33.12, distribution-scope=public, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, name=rhceph, RELEASE=main) Nov 26 02:46:03 localhost systemd[1]: var-lib-containers-storage-overlay-d096698d69a0992542b250ed5a5cc87c22d624a6431119d43962f50c6b51f930-merged.mount: Deactivated successfully. Nov 26 02:46:03 localhost podman[31655]: Nov 26 02:46:03 localhost podman[31655]: 2025-11-26 07:46:03.607169227 +0000 UTC m=+0.069230504 container create fee32db2deec5725f1f38e48842deb1d64c955d4176beaadefba6e72b959e6a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., name=rhceph, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, 
io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.12) Nov 26 02:46:03 localhost systemd[1]: tmp-crun.ededuc.mount: Deactivated successfully. Nov 26 02:46:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b7a17f43f60de08dd7bd272a08dc9b33fa5cb0fd45b31f41bea2758dda1f23b/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b7a17f43f60de08dd7bd272a08dc9b33fa5cb0fd45b31f41bea2758dda1f23b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:03 localhost podman[31655]: 2025-11-26 07:46:03.582286859 +0000 UTC m=+0.044348166 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b7a17f43f60de08dd7bd272a08dc9b33fa5cb0fd45b31f41bea2758dda1f23b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b7a17f43f60de08dd7bd272a08dc9b33fa5cb0fd45b31f41bea2758dda1f23b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b7a17f43f60de08dd7bd272a08dc9b33fa5cb0fd45b31f41bea2758dda1f23b/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:03 localhost podman[31655]: 2025-11-26 07:46:03.736868071 +0000 UTC m=+0.198929348 container init fee32db2deec5725f1f38e48842deb1d64c955d4176beaadefba6e72b959e6a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, name=rhceph, ceph=True, RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 02:46:03 localhost podman[31655]: 2025-11-26 07:46:03.746555358 +0000 UTC m=+0.208616635 container start fee32db2deec5725f1f38e48842deb1d64c955d4176beaadefba6e72b959e6a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, release=553, ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, 
build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, name=rhceph) Nov 26 02:46:03 localhost bash[31655]: fee32db2deec5725f1f38e48842deb1d64c955d4176beaadefba6e72b959e6a3 Nov 26 02:46:03 localhost systemd[1]: Started Ceph osd.0 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606. Nov 26 02:46:03 localhost ceph-osd[31674]: set uid:gid to 167:167 (ceph:ceph) Nov 26 02:46:03 localhost ceph-osd[31674]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2 Nov 26 02:46:03 localhost ceph-osd[31674]: pidfile_write: ignore empty --pid-file Nov 26 02:46:03 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:03 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Nov 26 02:46:03 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:03 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 26 02:46:03 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:03 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Nov 26 02:46:03 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:03 localhost 
ceph-osd[31674]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Nov 26 02:46:03 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) close Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) close Nov 26 02:46:04 localhost ceph-osd[31674]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal Nov 26 02:46:04 localhost ceph-osd[31674]: load: jerasure load: lrc Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) close Nov 26 02:46:04 localhost podman[31766]: Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 26 02:46:04 localhost 
ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) close Nov 26 02:46:04 localhost podman[31766]: 2025-11-26 07:46:04.610307443 +0000 UTC m=+0.078224385 container create e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_chaplygin, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_CLEAN=True, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 02:46:04 localhost podman[31766]: 2025-11-26 07:46:04.577769547 +0000 UTC m=+0.045686499 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:04 localhost systemd[1]: Started libpod-conmon-e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b.scope. Nov 26 02:46:04 localhost systemd[1]: Started libcrun container. 
Nov 26 02:46:04 localhost podman[31766]: 2025-11-26 07:46:04.724194359 +0000 UTC m=+0.192111301 container init e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_chaplygin, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, RELEASE=main, maintainer=Guillaume Abrioux , release=553, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 02:46:04 localhost podman[31766]: 2025-11-26 07:46:04.735838834 +0000 UTC m=+0.203755776 container start e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_chaplygin, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph 
Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, RELEASE=main, vcs-type=git, name=rhceph, release=553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55) Nov 26 02:46:04 localhost podman[31766]: 2025-11-26 07:46:04.736203444 +0000 UTC m=+0.204120386 container attach e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_chaplygin, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, release=553, GIT_CLEAN=True, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_BRANCH=main, version=7) Nov 26 02:46:04 localhost sleepy_chaplygin[31785]: 167 167 Nov 26 02:46:04 localhost systemd[1]: libpod-e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b.scope: 
Deactivated successfully. Nov 26 02:46:04 localhost podman[31766]: 2025-11-26 07:46:04.742863996 +0000 UTC m=+0.210780988 container died e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_chaplygin, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , release=553, ceph=True, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 02:46:04 localhost podman[31790]: 2025-11-26 07:46:04.839347457 +0000 UTC m=+0.085038652 container remove e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_chaplygin, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, 
release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, maintainer=Guillaume Abrioux , vcs-type=git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 02:46:04 localhost systemd[1]: libpod-conmon-e9a214b0c3528cfbd28cd61fcf82bcd87913357d7cde29e7f738402db5d2db3b.scope: Deactivated successfully. Nov 26 02:46:04 localhost ceph-osd[31674]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Nov 26 02:46:04 localhost ceph-osd[31674]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1020e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block 
failed: (22) Invalid argument Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:04 localhost ceph-osd[31674]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Nov 26 02:46:04 localhost ceph-osd[31674]: bluefs mount Nov 26 02:46:04 localhost ceph-osd[31674]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 26 02:46:04 localhost ceph-osd[31674]: bluefs mount shared_bdev_used = 0 Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: RocksDB version: 7.9.2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Git sha 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: DB SUMMARY Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: DB Session ID: NWWUYC1ZLYKO61WRQULB Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: CURRENT file: CURRENT Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: IDENTITY file: IDENTITY Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.error_if_exists: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.create_if_missing: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_checks: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.flush_verify_memtable_count: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.env: 0x5576e12b4cb0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.fs: LegacyFileSystem Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.info_log: 0x5576e1f9a780 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_file_opening_threads: 16 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.statistics: (nil) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.use_fsync: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_log_file_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.log_file_time_to_roll: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.keep_log_file_num: 1000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.recycle_log_file_num: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.allow_fallocate: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.allow_mmap_reads: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.allow_mmap_writes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.use_direct_reads: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.create_missing_column_families: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.db_log_dir: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.wal_dir: db.wal Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_cache_numshardbits: 6 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 26 02:46:04 
localhost ceph-osd[31674]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.advise_random_on_open: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.db_write_buffer_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_manager: 0x5576e100a140 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.use_adaptive_mutex: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.rate_limiter: (nil) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.wal_recovery_mode: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_thread_tracking: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_pipelined_write: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.unordered_write: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.row_cache: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.wal_filter: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.avoid_flush_during_recovery: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.allow_ingest_behind: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.two_write_queues: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.manual_wal_flush: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.wal_compression: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.atomic_flush: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.persist_stats_to_disk: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.log_readahead_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.best_efforts_recovery: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.allow_data_in_errors: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.db_host_id: __hostname__ Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enforce_single_del_contracts: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_background_jobs: 4 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_background_compactions: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_subcompactions: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.writable_file_max_buffer_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.delayed_write_rate : 16777216 Nov 26 02:46:04 localhost ceph-osd[31674]: 
rocksdb: Options.max_total_wal_size: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.stats_dump_period_sec: 600 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.stats_persist_period_sec: 600 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_open_files: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bytes_per_sync: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_background_flushes: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Compression algorithms supported: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: #011kZSTD supported: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: #011kXpressCompression supported: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: #011kBZip2Compression supported: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: #011kLZ4Compression supported: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: #011kZlibCompression supported: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: #011kSnappyCompression supported: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: DMutex implementation: pthread_mutex_t Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in 
read only mode Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9a940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff8850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 
0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: 
false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors:
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9a940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff8850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9a940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff8850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9a940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff8850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9a940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff8850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb:
Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost 
ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:04 localhost ceph-osd[31674]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.bloom_locality: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [p-1]: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9a940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff8850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.write_buffer_size: 16777216 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 
32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:04 localhost 
ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 
02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9a940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff8850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost 
ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:04 localhost 
ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.periodic_compaction_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:04 
localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9ab60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff82d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:04 localhost 
ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 
02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:04 localhost 
ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:04 localhost 
ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, 
name: O-1) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9ab60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff82d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 
initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: 
rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.blob_compression_type: NoCompression Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e1f9ab60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff82d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.target_file_size_multiplier: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: 
Options.report_bg_io_stats: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, 
next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 62e581d6-5568-4f1a-a6b8-b37d738d43df Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143164935645, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 26 
02:46:04 localhost ceph-osd[31674]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143164935980, "job": 1, "event": "recovery_finished"} Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025 Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240 Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3 Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000 Nov 26 02:46:04 localhost ceph-osd[31674]: freelist init Nov 26 02:46:04 localhost ceph-osd[31674]: freelist _read_cfg Nov 26 02:46:04 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07 Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work Nov 26 02:46:04 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete Nov 26 02:46:04 localhost ceph-osd[31674]: bluefs umount Nov 26 02:46:04 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) close Nov 26 02:46:05 localhost podman[32012]: Nov 26 02:46:05 localhost podman[32012]: 2025-11-26 07:46:05.172314674 +0000 UTC m=+0.082134430 container create 
3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test, version=7, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55) Nov 26 02:46:05 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Nov 26 02:46:05 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Nov 26 02:46:05 localhost ceph-osd[31674]: bdev(0x5576e1021180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:05 localhost ceph-osd[31674]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Nov 26 02:46:05 localhost ceph-osd[31674]: bluefs mount Nov 26 02:46:05 localhost ceph-osd[31674]: bluefs _init_alloc shared, id 1, capacity 
0x1bfc00000, block size 0x10000 Nov 26 02:46:05 localhost ceph-osd[31674]: bluefs mount shared_bdev_used = 4718592 Nov 26 02:46:05 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: RocksDB version: 7.9.2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Git sha 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: DB SUMMARY Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: DB Session ID: NWWUYC1ZLYKO61WRQULA Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: CURRENT file: CURRENT Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: IDENTITY file: IDENTITY Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.error_if_exists: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.create_if_missing: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_checks: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.env: 0x5576e12b5ea0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.fs: LegacyFileSystem Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.info_log: 0x5576e1f9b880 Nov 26 02:46:05 localhost 
ceph-osd[31674]: rocksdb: Options.max_file_opening_threads: 16 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.statistics: (nil) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.use_fsync: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_log_file_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.log_file_time_to_roll: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.keep_log_file_num: 1000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.recycle_log_file_num: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.allow_fallocate: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.allow_mmap_reads: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.allow_mmap_writes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.use_direct_reads: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.create_missing_column_families: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.db_log_dir: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.wal_dir: db.wal Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_cache_numshardbits: 6 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.advise_random_on_open: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.db_write_buffer_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_manager: 0x5576e100b540 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.use_adaptive_mutex: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.rate_limiter: (nil) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.wal_recovery_mode: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_thread_tracking: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_pipelined_write: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.unordered_write: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.row_cache: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.wal_filter: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.allow_ingest_behind: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.two_write_queues: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.manual_wal_flush: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.wal_compression: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.atomic_flush: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.avoid_unnecessary_blocking_io: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.persist_stats_to_disk: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.log_readahead_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.best_efforts_recovery: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.allow_data_in_errors: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.db_host_id: __hostname__ Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enforce_single_del_contracts: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_background_jobs: 4 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_background_compactions: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_subcompactions: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.writable_file_max_buffer_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.delayed_write_rate : 16777216 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.stats_dump_period_sec: 600 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.stats_persist_period_sec: 600 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.max_open_files: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bytes_per_sync: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_background_flushes: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Compression algorithms supported: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: #011kZSTD supported: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: #011kXpressCompression supported: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: #011kBZip2Compression supported: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: #011kLZ4Compression supported: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: #011kZlibCompression supported: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: #011kSnappyCompression supported: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: DMutex implementation: pthread_mutex_t Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: 
.T:int64_array.b:bitwise_xor Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b9200)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff82d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.compression: LZ4 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.inplace_update_support: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.blob_compaction_readahead_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b9200)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff82d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 
1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost 
ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b9200)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5576e0ff82d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b9200)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5576e0ff82d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b9200)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5576e0ff82d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b9200)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5576e0ff82d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7
Nov 26 02:46:05 localhost
ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 
02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:05 localhost 
ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:05 localhost 
ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, 
name: p-2) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b9200)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff82d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 
initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: 
rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.blob_compression_type: NoCompression Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b8fc0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff9610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.target_file_size_multiplier: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.report_bg_io_stats: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: 
rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b8fc0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff9610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost 
ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.merge_operator: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5576e10b8fc0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5576e0ff9610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression: LZ4 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.num_levels: 7 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
Options.enable_blob_files: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: 
[db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 62e581d6-5568-4f1a-a6b8-b37d738d43df Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143165208903, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143165215463, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, 
"index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764143165, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "62e581d6-5568-4f1a-a6b8-b37d738d43df", "db_session_id": "NWWUYC1ZLYKO61WRQULA", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143165223767, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1607, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 466, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": 
"nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764143165, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "62e581d6-5568-4f1a-a6b8-b37d738d43df", "db_session_id": "NWWUYC1ZLYKO61WRQULA", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143165232163, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764143165, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": 
"62e581d6-5568-4f1a-a6b8-b37d738d43df", "db_session_id": "NWWUYC1ZLYKO61WRQULA", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Nov 26 02:46:05 localhost systemd[1]: Started libpod-conmon-3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade.scope. Nov 26 02:46:05 localhost podman[32012]: 2025-11-26 07:46:05.139124392 +0000 UTC m=+0.048944148 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143165238761, "job": 1, "event": "recovery_finished"} Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Nov 26 02:46:05 localhost systemd[1]: Started libcrun container. Nov 26 02:46:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd5333dbd59ca9eff3af1aae69318a6fccf48035d9a6f66a3110e21f0129e1e7/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd5333dbd59ca9eff3af1aae69318a6fccf48035d9a6f66a3110e21f0129e1e7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5576e10c0380 Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: DB pointer 0x5576e1ef3a00 Nov 26 02:46:05 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options 
compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 26 02:46:05 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4 Nov 26 02:46:05 localhost ceph-osd[31674]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 02:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 
0.01 0.00 1 0.006 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.1e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 460.80 MB usag Nov 26 02:46:05 localhost kernel: xfs 
filesystem being remounted at /var/lib/containers/storage/overlay/cd5333dbd59ca9eff3af1aae69318a6fccf48035d9a6f66a3110e21f0129e1e7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:05 localhost ceph-osd[31674]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Nov 26 02:46:05 localhost ceph-osd[31674]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Nov 26 02:46:05 localhost ceph-osd[31674]: _get_class not permitted to load lua Nov 26 02:46:05 localhost ceph-osd[31674]: _get_class not permitted to load sdk Nov 26 02:46:05 localhost ceph-osd[31674]: _get_class not permitted to load test_remote_reads Nov 26 02:46:05 localhost ceph-osd[31674]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients Nov 26 02:46:05 localhost ceph-osd[31674]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Nov 26 02:46:05 localhost ceph-osd[31674]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds Nov 26 02:46:05 localhost ceph-osd[31674]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Nov 26 02:46:05 localhost ceph-osd[31674]: osd.0 0 load_pgs Nov 26 02:46:05 localhost ceph-osd[31674]: osd.0 0 load_pgs opened 0 pgs Nov 26 02:46:05 localhost ceph-osd[31674]: osd.0 0 log_to_monitors true Nov 26 02:46:05 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0[31670]: 2025-11-26T07:46:05.298+0000 7f8627ecda80 -1 osd.0 0 log_to_monitors true Nov 26 02:46:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd5333dbd59ca9eff3af1aae69318a6fccf48035d9a6f66a3110e21f0129e1e7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:05 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/cd5333dbd59ca9eff3af1aae69318a6fccf48035d9a6f66a3110e21f0129e1e7/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:05 localhost podman[32012]: 2025-11-26 07:46:05.318889831 +0000 UTC m=+0.228709627 container init 3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test, vendor=Red Hat, Inc., vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, ceph=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux ) Nov 26 02:46:05 localhost podman[32012]: 2025-11-26 07:46:05.329231144 +0000 UTC m=+0.239050900 container start 3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., 
GIT_BRANCH=main, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 02:46:05 localhost podman[32012]: 2025-11-26 07:46:05.329533661 +0000 UTC m=+0.239353457 container attach 3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.expose-services=, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, release=553) Nov 26 02:46:05 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test[32208]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Nov 26 02:46:05 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test[32208]: [--no-systemd] [--no-tmpfs] Nov 26 02:46:05 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test[32208]: ceph-volume activate: error: unrecognized arguments: --bad-option Nov 26 02:46:05 localhost systemd[1]: libpod-3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade.scope: Deactivated successfully. Nov 26 02:46:05 localhost podman[32012]: 2025-11-26 07:46:05.554865335 +0000 UTC m=+0.464685091 container died 3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, io.buildah.version=1.33.12, io.openshift.expose-services=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55) Nov 26 02:46:05 localhost systemd[1]: var-lib-containers-storage-overlay-edd2da199320ebd8e51c158f450edbc9b1d53893f3cdb849d79be45b65889c29-merged.mount: Deactivated successfully. Nov 26 02:46:05 localhost podman[32246]: 2025-11-26 07:46:05.653414226 +0000 UTC m=+0.087094952 container remove 3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux ) Nov 26 02:46:05 localhost systemd[1]: libpod-conmon-3ad16160daef4781ab26f85118fbcd06a2741eb9e46a71e630085924b8252ade.scope: Deactivated successfully. Nov 26 02:46:05 localhost systemd[1]: Reloading. Nov 26 02:46:06 localhost systemd-rc-local-generator[32301]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 02:46:06 localhost systemd-sysv-generator[32306]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 02:46:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:46:06 localhost systemd[1]: Reloading. Nov 26 02:46:06 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Nov 26 02:46:06 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Nov 26 02:46:06 localhost systemd-sysv-generator[32343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 02:46:06 localhost systemd-rc-local-generator[32340]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:46:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 02:46:06 localhost ceph-osd[31674]: osd.0 0 done with init, starting boot process Nov 26 02:46:06 localhost ceph-osd[31674]: osd.0 0 start_boot Nov 26 02:46:06 localhost ceph-osd[31674]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1 Nov 26 02:46:06 localhost ceph-osd[31674]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Nov 26 02:46:06 localhost ceph-osd[31674]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Nov 26 02:46:06 localhost ceph-osd[31674]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Nov 26 02:46:06 localhost ceph-osd[31674]: osd.0 0 bench count 12288000 bsize 4 KiB Nov 26 02:46:06 localhost systemd[1]: Starting Ceph osd.4 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606... Nov 26 02:46:06 localhost podman[32407]: Nov 26 02:46:06 localhost podman[32407]: 2025-11-26 07:46:06.778070945 +0000 UTC m=+0.072950016 container create d948067ef47285257b2a7eb518a6c90d909dc3f505536feebd4b11e8ed19f6fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.expose-services=) Nov 26 02:46:06 localhost systemd[1]: tmp-crun.p6k92k.mount: Deactivated successfully. Nov 26 02:46:06 localhost podman[32407]: 2025-11-26 07:46:06.737340819 +0000 UTC m=+0.032219900 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:06 localhost systemd[1]: Started libcrun container. Nov 26 02:46:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5711858d9e140793d26f3b67f8cf758d17aa80eaad175a764b14526edaa8e024/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5711858d9e140793d26f3b67f8cf758d17aa80eaad175a764b14526edaa8e024/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5711858d9e140793d26f3b67f8cf758d17aa80eaad175a764b14526edaa8e024/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5711858d9e140793d26f3b67f8cf758d17aa80eaad175a764b14526edaa8e024/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5711858d9e140793d26f3b67f8cf758d17aa80eaad175a764b14526edaa8e024/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:06 localhost podman[32407]: 2025-11-26 07:46:06.93355664 +0000 UTC m=+0.228435681 container init d948067ef47285257b2a7eb518a6c90d909dc3f505536feebd4b11e8ed19f6fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate, 
io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, RELEASE=main, io.buildah.version=1.33.12, release=553, io.openshift.tags=rhceph ceph, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 02:46:06 localhost podman[32407]: 2025-11-26 07:46:06.942522979 +0000 UTC m=+0.237402020 container start d948067ef47285257b2a7eb518a6c90d909dc3f505536feebd4b11e8ed19f6fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, release=553, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.) Nov 26 02:46:06 localhost podman[32407]: 2025-11-26 07:46:06.942746874 +0000 UTC m=+0.237625905 container attach d948067ef47285257b2a7eb518a6c90d909dc3f505536feebd4b11e8ed19f6fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git) Nov 26 02:46:07 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate[32421]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 Nov 26 02:46:07 localhost bash[32407]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 Nov 26 02:46:07 localhost 
ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate[32421]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-4 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Nov 26 02:46:07 localhost bash[32407]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-4 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Nov 26 02:46:07 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate[32421]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Nov 26 02:46:07 localhost bash[32407]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Nov 26 02:46:07 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate[32421]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Nov 26 02:46:07 localhost bash[32407]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Nov 26 02:46:07 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate[32421]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block Nov 26 02:46:07 localhost bash[32407]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block Nov 26 02:46:07 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate[32421]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 Nov 26 02:46:07 localhost bash[32407]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 Nov 26 02:46:07 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate[32421]: --> ceph-volume raw activate successful for osd ID: 4 Nov 26 02:46:07 localhost bash[32407]: --> ceph-volume raw activate successful for osd ID: 4 Nov 26 02:46:07 localhost systemd[1]: libpod-d948067ef47285257b2a7eb518a6c90d909dc3f505536feebd4b11e8ed19f6fc.scope: Deactivated successfully. 
Nov 26 02:46:07 localhost podman[32407]: 2025-11-26 07:46:07.565559683 +0000 UTC m=+0.860438824 container died d948067ef47285257b2a7eb518a6c90d909dc3f505536feebd4b11e8ed19f6fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.buildah.version=1.33.12, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7) Nov 26 02:46:07 localhost podman[32552]: 2025-11-26 07:46:07.673465534 +0000 UTC m=+0.096591074 container remove d948067ef47285257b2a7eb518a6c90d909dc3f505536feebd4b11e8ed19f6fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4-activate, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red 
Hat, Inc., GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, RELEASE=main, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 02:46:07 localhost systemd[1]: var-lib-containers-storage-overlay-5711858d9e140793d26f3b67f8cf758d17aa80eaad175a764b14526edaa8e024-merged.mount: Deactivated successfully. Nov 26 02:46:07 localhost podman[32613]: Nov 26 02:46:07 localhost podman[32613]: 2025-11-26 07:46:07.992133272 +0000 UTC m=+0.087714838 container create 0ff402139bf8cc1270ea516468dbd52d642102c90c7b752f2f048f605bc98702 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=553, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_BRANCH=main) Nov 26 02:46:08 localhost systemd[1]: tmp-crun.bow6LK.mount: Deactivated successfully. Nov 26 02:46:08 localhost podman[32613]: 2025-11-26 07:46:07.960527938 +0000 UTC m=+0.056109474 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/040891b192199b6bd59636fb030def068479397671b77e29f7d0eb54bb0a1fc1/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/040891b192199b6bd59636fb030def068479397671b77e29f7d0eb54bb0a1fc1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/040891b192199b6bd59636fb030def068479397671b77e29f7d0eb54bb0a1fc1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/040891b192199b6bd59636fb030def068479397671b77e29f7d0eb54bb0a1fc1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/040891b192199b6bd59636fb030def068479397671b77e29f7d0eb54bb0a1fc1/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:08 localhost podman[32613]: 2025-11-26 07:46:08.124992682 +0000 UTC m=+0.220574208 container init 0ff402139bf8cc1270ea516468dbd52d642102c90c7b752f2f048f605bc98702 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, 
name=rhceph, release=553, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7) Nov 26 02:46:08 localhost podman[32613]: 2025-11-26 07:46:08.135479409 +0000 UTC m=+0.231060945 container start 0ff402139bf8cc1270ea516468dbd52d642102c90c7b752f2f048f605bc98702 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, maintainer=Guillaume Abrioux , vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red 
Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhceph, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 02:46:08 localhost bash[32613]: 0ff402139bf8cc1270ea516468dbd52d642102c90c7b752f2f048f605bc98702 Nov 26 02:46:08 localhost systemd[1]: Started Ceph osd.4 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606. Nov 26 02:46:08 localhost ceph-osd[32631]: set uid:gid to 167:167 (ceph:ceph) Nov 26 02:46:08 localhost ceph-osd[32631]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2 Nov 26 02:46:08 localhost ceph-osd[32631]: pidfile_write: ignore empty --pid-file Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:08 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:08 localhost ceph-osd[32631]: bluefs add_block_device bdev 1 path 
/var/lib/ceph/osd/ceph-4/block size 7.0 GiB Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) close Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) close Nov 26 02:46:08 localhost ceph-osd[32631]: starting osd.4 osd_data /var/lib/ceph/osd/ceph-4 /var/lib/ceph/osd/ceph-4/journal Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 32.491 iops: 8317.585 elapsed_sec: 0.361 Nov 26 02:46:08 localhost ceph-osd[31674]: log_channel(cluster) log [WRN] : OSD bench result of 8317.584772 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 0 waiting for initial osdmap Nov 26 02:46:08 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0[31670]: 2025-11-26T07:46:08.735+0000 7f8624661640 -1 osd.0 0 waiting for initial osdmap Nov 26 02:46:08 localhost ceph-osd[32631]: load: jerasure load: lrc Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:08 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 26 02:46:08 localhost ceph-osd[32631]: bdev(0x55933555ee00 
/var/lib/ceph/osd/ceph-4/block) close Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 10 crush map has features 288514050185494528, adjusting msgr requires for clients Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 10 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 10 crush map has features 3314932999778484224, adjusting msgr requires for osds Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 10 check_osdmap_features require_osd_release unknown -> reef Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 10 set_numa_affinity not setting numa affinity Nov 26 02:46:08 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-0[31670]: 2025-11-26T07:46:08.755+0000 7f861f476640 -1 osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 26 02:46:08 localhost ceph-osd[31674]: osd.0 10 _collect_metadata loop3: no unique device id for loop3: fallback method has no model nor serial Nov 26 02:46:08 localhost podman[32721]: Nov 26 02:46:08 localhost podman[32721]: 2025-11-26 07:46:08.893034525 +0000 UTC m=+0.053719875 container create cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_booth, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, 
io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, release=553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 02:46:08 localhost systemd[1]: Started libpod-conmon-cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042.scope. Nov 26 02:46:08 localhost systemd[1]: Started libcrun container. Nov 26 02:46:08 localhost podman[32721]: 2025-11-26 07:46:08.959170474 +0000 UTC m=+0.119855824 container init cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_booth, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., release=553, name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12) Nov 26 02:46:08 localhost 
podman[32721]: 2025-11-26 07:46:08.864282322 +0000 UTC m=+0.024967672 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:08 localhost podman[32721]: 2025-11-26 07:46:08.968369059 +0000 UTC m=+0.129054399 container start cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_booth, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, version=7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=553) Nov 26 02:46:08 localhost podman[32721]: 2025-11-26 07:46:08.968574544 +0000 UTC m=+0.129259904 container attach cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_booth, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-type=git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7) Nov 26 02:46:08 localhost great_booth[32736]: 167 167 Nov 26 02:46:08 localhost systemd[1]: libpod-cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042.scope: Deactivated successfully. Nov 26 02:46:08 localhost podman[32721]: 2025-11-26 07:46:08.97331976 +0000 UTC m=+0.134005110 container died cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_booth, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, name=rhceph, distribution-scope=public, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vcs-type=git, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) close Nov 26 02:46:09 localhost systemd[1]: var-lib-containers-storage-overlay-3d0d3fa74ea62a054274fb74692718f5bde12f58e73660e3a5f1403b66cd1c47-merged.mount: Deactivated successfully. 
Nov 26 02:46:09 localhost podman[32741]: 2025-11-26 07:46:09.067600107 +0000 UTC m=+0.083292509 container remove cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_booth, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=553, GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, name=rhceph) Nov 26 02:46:09 localhost systemd[1]: libpod-conmon-cb92bf62a2fe03b0141e67d0c75a9b0c07e9f832bb9eb29828e410c377d8b042.scope: Deactivated successfully. 
Nov 26 02:46:09 localhost podman[32767]: Nov 26 02:46:09 localhost podman[32767]: 2025-11-26 07:46:09.227058609 +0000 UTC m=+0.061971208 container create 5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_kilby, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.expose-services=) Nov 26 02:46:09 localhost systemd[1]: Started libpod-conmon-5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b.scope. Nov 26 02:46:09 localhost systemd[1]: Started libcrun container. 
Nov 26 02:46:09 localhost ceph-osd[32631]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Nov 26 02:46:09 localhost ceph-osd[32631]: osd.4:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555ee00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs mount Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs mount shared_bdev_used = 0 Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _prepare_db_environment set db_paths to 
db,7136398540 db.slow,7136398540 Nov 26 02:46:09 localhost podman[32767]: 2025-11-26 07:46:09.197033954 +0000 UTC m=+0.031946583 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: RocksDB version: 7.9.2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Git sha 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: DB SUMMARY Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: DB Session ID: RKQSZY13P3MD8JIYY8YX Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: CURRENT file: CURRENT Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: IDENTITY file: IDENTITY Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.error_if_exists: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.create_if_missing: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.env: 0x5593357f2cb0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.fs: LegacyFileSystem Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.info_log: 0x5593364dc780 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_file_opening_threads: 16 Nov 26 02:46:09 
localhost ceph-osd[32631]: rocksdb: Options.statistics: (nil) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.use_fsync: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_log_file_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.log_file_time_to_roll: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.keep_log_file_num: 1000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.recycle_log_file_num: 0 Nov 26 02:46:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/264054dccb3b5760eeb9557a9b09be06f74a804418a8aead089fd1f2f3be0c29/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_fallocate: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_mmap_reads: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_mmap_writes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.use_direct_reads: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.create_missing_column_families: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.db_log_dir: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_dir: db.wal Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_cache_numshardbits: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 26 
02:46:09 localhost ceph-osd[32631]: rocksdb: Options.advise_random_on_open: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.db_write_buffer_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_manager: 0x559335548140 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.use_adaptive_mutex: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.rate_limiter: (nil) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_recovery_mode: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_thread_tracking: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_pipelined_write: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.unordered_write: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.row_cache: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_ingest_behind: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.two_write_queues: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.manual_wal_flush: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_compression: 0 Nov 26 
02:46:09 localhost ceph-osd[32631]: rocksdb: Options.atomic_flush: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.persist_stats_to_disk: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.log_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.best_efforts_recovery: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_data_in_errors: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.db_host_id: __hostname__ Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enforce_single_del_contracts: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_background_jobs: 4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_background_compactions: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_subcompactions: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.writable_file_max_buffer_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.delayed_write_rate : 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.stats_dump_period_sec: 600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.stats_persist_period_sec: 600 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_open_files: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bytes_per_sync: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_background_flushes: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Compression algorithms supported: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kZSTD supported: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kXpressCompression supported: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kBZip2Compression supported: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kLZ4Compression supported: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kZlibCompression supported: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kSnappyCompression supported: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: DMutex implementation: pthread_mutex_t Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
[db/column_family.cc:630] --------------- Options for column family [default]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dc940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335536850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 
num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dc940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335536850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dc940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335536850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dc940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335536850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/264054dccb3b5760eeb9557a9b09be06f74a804418a8aead089fd1f2f3be0c29/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dc940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335536850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb:
Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: 
rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:09 
localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None Nov 26 
02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dc940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335536850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 
localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: 
rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:09 localhost ceph-osd[32631]: 
rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.blob_compaction_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dc940)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335536850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 
1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dcb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5593355362d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
  capacity : 536870912
  num_shard_bits : 4
  strict_capacity_limit : 0
  high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 26 02:46:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/264054dccb3b5760eeb9557a9b09be06f74a804418a8aead089fd1f2f3be0c29/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dcb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5593355362d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
  capacity : 536870912
  num_shard_bits : 4
  strict_capacity_limit : 0
  high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593364dcb60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x5593355362d0
  block_cache_name: BinnedLRUCache
  block_cache_options:
  capacity : 536870912
  num_shard_bits : 4
  strict_capacity_limit : 0
  high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2856d745-bff3-4075-9a36-d16a211b3a4b
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143169312423, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143169312711, "job": 1, "event": "recovery_finished"}
Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta old nid_max 1025
Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta old blobid_max 10240
Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta min_alloc_size 0x1000
Nov 26 02:46:09 localhost ceph-osd[32631]: freelist init
Nov 26 02:46:09 localhost ceph-osd[32631]: freelist _read_cfg
Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs umount
Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) close
Nov 26 02:46:09 localhost podman[32767]: 2025-11-26 07:46:09.333558015 +0000 UTC m=+0.168470614 container init 5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_kilby, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.buildah.version=1.33.12)
Nov 26 02:46:09 localhost podman[32767]: 2025-11-26 07:46:09.343850206 +0000 UTC m=+0.178762805 container start 5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_kilby, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True, RELEASE=main, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, 
distribution-scope=public, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=) Nov 26 02:46:09 localhost podman[32767]: 2025-11-26 07:46:09.344151433 +0000 UTC m=+0.179064032 container attach 5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_kilby, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-type=git, release=553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True) Nov 26 02:46:09 localhost ceph-osd[31674]: osd.0 11 state: booting -> active Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Nov 26 
02:46:09 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Nov 26 02:46:09 localhost ceph-osd[32631]: bdev(0x55933555f180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs mount Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 26 02:46:09 localhost ceph-osd[32631]: bluefs mount shared_bdev_used = 4718592 Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: RocksDB version: 7.9.2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Git sha 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: DB SUMMARY Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: DB Session ID: RKQSZY13P3MD8JIYY8YW Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: CURRENT file: CURRENT Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: IDENTITY file: IDENTITY Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.error_if_exists: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.create_if_missing: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.env: 0x5593357f3dc0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.fs: LegacyFileSystem Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.info_log: 0x5593364ddd00 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_file_opening_threads: 16 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.statistics: (nil) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.use_fsync: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_log_file_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.log_file_time_to_roll: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.keep_log_file_num: 1000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.recycle_log_file_num: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_fallocate: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_mmap_reads: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_mmap_writes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.use_direct_reads: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.create_missing_column_families: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.db_log_dir: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_dir: db.wal Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.table_cache_numshardbits: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.advise_random_on_open: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.db_write_buffer_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_manager: 0x559335548140 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.use_adaptive_mutex: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.rate_limiter: (nil) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_recovery_mode: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_thread_tracking: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_pipelined_write: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.unordered_write: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.row_cache: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_ingest_behind: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.two_write_queues: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.manual_wal_flush: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_compression: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.atomic_flush: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.persist_stats_to_disk: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.log_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.best_efforts_recovery: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.allow_data_in_errors: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.db_host_id: __hostname__ Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enforce_single_del_contracts: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_background_jobs: 4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_background_compactions: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_subcompactions: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.writable_file_max_buffer_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.delayed_write_rate : 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.stats_dump_period_sec: 600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.stats_persist_period_sec: 600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_open_files: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bytes_per_sync: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_background_flushes: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Compression algorithms supported: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kZSTD supported: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kXpressCompression supported: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kBZip2Compression supported: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kLZ4Compression supported: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kZlibCompression supported: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: #011kSnappyCompression supported: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 26 02:46:09 
localhost ceph-osd[32631]: rocksdb: DMutex implementation: pthread_mutex_t Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365440a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5593355362d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 
1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: 
rocksdb: Options.enable_blob_files: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365440a0)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5593355362d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 
localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365440a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5593355362d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 
32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365440a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5593355362d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 
block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 
localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 
26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory 
options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365440a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5593355362d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost 
ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365440a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5593355362d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365440a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5593355362d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365442e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335537610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365442e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335537610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 02:46:09 localhost ceph-osd[32631]: 
rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
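As an editorial aside on the options dumped above: with `compaction_style: kCompactionStyleLevel`, `max_bytes_for_level_base: 1073741824` (1 GiB), `max_bytes_for_level_multiplier: 8`, all `addtl` multipliers at 1, and `level_compaction_dynamic_level_bytes: 0`, the per-level size targets follow a simple geometric progression. The sketch below just computes those implied targets from the values in this dump; it is an illustration of the arithmetic, not part of the log.

```python
# Per-level size targets implied by the RocksDB options dumped above:
# max_bytes_for_level_base = 1073741824 (1 GiB), multiplier = 8.0,
# num_levels = 7, addtl multipliers all 1, dynamic level bytes off.
base = 1073741824
mult = 8.0

# Targets for L1..L6 (L0 is governed by file-count triggers instead).
targets = [int(base * mult**i) for i in range(6)]

print([t // 2**30 for t in targets])  # sizes in GiB: [1, 8, 64, 512, 4096, 32768]
```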
Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.blob_compression_type: NoCompression Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.merge_operator: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_filter_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.sst_partitioner_factory: None Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_factory: SkipListFactory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_factory: BlockBasedTable Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5593365442e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559335537610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.write_buffer_size: 16777216 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number: 64 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression: LZ4 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression: Disabled Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.num_levels: 7 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.window_bits: -14 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.level: 32767 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.strategy: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.enabled: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.target_file_size_base: 67108864 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.target_file_size_multiplier: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.arena_block_size: 1048576 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.disable_auto_compactions: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_support: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.bloom_locality: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.max_successive_merges: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.force_consistency_checks: 1 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
Options.report_bg_io_stats: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.ttl: 2592000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_files: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.min_blob_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_size: 268435456 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, 
next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2856d745-bff3-4075-9a36-d16a211b3a4b Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143169580166, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 26 
02:46:09 localhost ceph-osd[32631]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143169602433, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764143169, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2856d745-bff3-4075-9a36-d16a211b3a4b", "db_session_id": "RKQSZY13P3MD8JIYY8YW", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143169631460, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, 
"raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764143169, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2856d745-bff3-4075-9a36-d16a211b3a4b", "db_session_id": "RKQSZY13P3MD8JIYY8YW", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143169657856, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", 
"property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764143169, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2856d745-bff3-4075-9a36-d16a211b3a4b", "db_session_id": "RKQSZY13P3MD8JIYY8YW", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764143169665516, "job": 1, "event": "recovery_finished"} Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5593355fa700 Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: DB pointer 0x559336433a00 Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _upgrade_super from 4, latest 4 Nov 26 02:46:09 localhost ceph-osd[32631]: bluestore(/var/lib/ceph/osd/ceph-4) _upgrade_super done Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: 
[db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 02:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 
0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5593355362d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5593355362d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012 Nov 26 02:46:09 localhost ceph-osd[32631]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Nov 26 02:46:09 localhost ceph-osd[32631]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Nov 26 02:46:09 localhost ceph-osd[32631]: _get_class not permitted to load lua Nov 26 02:46:09 localhost ceph-osd[32631]: _get_class not permitted to load sdk Nov 26 02:46:09 localhost ceph-osd[32631]: _get_class not permitted to load test_remote_reads Nov 26 02:46:09 localhost ceph-osd[32631]: osd.4 0 crush map has features 288232575208783872, adjusting msgr requires for clients Nov 26 02:46:09 localhost 
ceph-osd[32631]: osd.4 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Nov 26 02:46:09 localhost ceph-osd[32631]: osd.4 0 crush map has features 288232575208783872, adjusting msgr requires for osds Nov 26 02:46:09 localhost ceph-osd[32631]: osd.4 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Nov 26 02:46:09 localhost ceph-osd[32631]: osd.4 0 load_pgs Nov 26 02:46:09 localhost ceph-osd[32631]: osd.4 0 load_pgs opened 0 pgs Nov 26 02:46:09 localhost ceph-osd[32631]: osd.4 0 log_to_monitors true Nov 26 02:46:09 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4[32627]: 2025-11-26T07:46:09.704+0000 7f43abf29a80 -1 osd.4 0 log_to_monitors true Nov 26 02:46:09 localhost relaxed_kilby[32783]: { Nov 26 02:46:09 localhost relaxed_kilby[32783]: "4c7370a1-c96a-417f-bde2-a93f51ef7561": { Nov 26 02:46:09 localhost relaxed_kilby[32783]: "ceph_fsid": "0d5e5e6d-3c4b-5efe-8c65-346ae6715606", Nov 26 02:46:09 localhost relaxed_kilby[32783]: "device": "/dev/mapper/ceph_vg0-ceph_lv0", Nov 26 02:46:09 localhost relaxed_kilby[32783]: "osd_id": 0, Nov 26 02:46:09 localhost relaxed_kilby[32783]: "osd_uuid": "4c7370a1-c96a-417f-bde2-a93f51ef7561", Nov 26 02:46:09 localhost relaxed_kilby[32783]: "type": "bluestore" Nov 26 02:46:09 localhost relaxed_kilby[32783]: }, Nov 26 02:46:09 localhost relaxed_kilby[32783]: "f9257f78-62ea-450a-a79b-9944ac21c834": { Nov 26 02:46:09 localhost relaxed_kilby[32783]: "ceph_fsid": "0d5e5e6d-3c4b-5efe-8c65-346ae6715606", Nov 26 02:46:09 localhost relaxed_kilby[32783]: "device": "/dev/mapper/ceph_vg1-ceph_lv1", Nov 26 02:46:09 localhost relaxed_kilby[32783]: "osd_id": 4, Nov 26 02:46:09 localhost relaxed_kilby[32783]: "osd_uuid": "f9257f78-62ea-450a-a79b-9944ac21c834", Nov 26 02:46:09 localhost relaxed_kilby[32783]: "type": "bluestore" Nov 26 02:46:09 localhost relaxed_kilby[32783]: } Nov 26 02:46:09 localhost relaxed_kilby[32783]: } Nov 26 02:46:09 localhost systemd[1]: 
libpod-5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b.scope: Deactivated successfully. Nov 26 02:46:09 localhost podman[32767]: 2025-11-26 07:46:09.911480536 +0000 UTC m=+0.746393125 container died 5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_kilby, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, RELEASE=main, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 02:46:09 localhost systemd[1]: var-lib-containers-storage-overlay-264054dccb3b5760eeb9557a9b09be06f74a804418a8aead089fd1f2f3be0c29-merged.mount: Deactivated successfully. 
Nov 26 02:46:10 localhost podman[33228]: 2025-11-26 07:46:10.012267122 +0000 UTC m=+0.086085178 container remove 5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_kilby, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, architecture=x86_64, version=7, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 02:46:10 localhost systemd[1]: libpod-conmon-5b590c74d7df20ca3e57273cfe8700913ca4374e140a4ba359f32bb06247cb7b.scope: Deactivated successfully. 
Nov 26 02:46:10 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Nov 26 02:46:10 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Nov 26 02:46:11 localhost podman[33351]: 2025-11-26 07:46:11.47444444 +0000 UTC m=+0.087033711 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, release=553, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 02:46:11 localhost ceph-osd[32631]: osd.4 0 done with init, starting boot process Nov 26 02:46:11 localhost ceph-osd[32631]: osd.4 0 start_boot Nov 26 02:46:11 localhost ceph-osd[32631]: osd.4 0 maybe_override_options_for_qos osd_max_backfills set to 1 Nov 26 02:46:11 localhost ceph-osd[32631]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Nov 26 02:46:11 localhost ceph-osd[32631]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Nov 26 02:46:11 localhost ceph-osd[32631]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Nov 26 02:46:11 localhost ceph-osd[32631]: osd.4 0 bench count 12288000 bsize 4 KiB Nov 26 02:46:11 localhost ceph-osd[31674]: osd.0 13 crush map has features 288514051259236352, adjusting msgr requires for clients Nov 26 02:46:11 localhost ceph-osd[31674]: osd.0 13 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons Nov 26 02:46:11 localhost ceph-osd[31674]: osd.0 13 crush map has features 3314933000852226048, adjusting msgr requires for osds Nov 26 02:46:11 localhost podman[33351]: 2025-11-26 07:46:11.618294739 +0000 UTC m=+0.230884000 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, release=553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, 
com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git) Nov 26 02:46:13 localhost ceph-osd[31674]: osd.0 pg_epoch: 13 pg[1.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=13) [2,0] r=1 lpr=13 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 02:46:13 localhost podman[33547]: Nov 26 02:46:13 localhost podman[33547]: 2025-11-26 07:46:13.547488354 +0000 UTC m=+0.072920645 container create 23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_franklin, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, release=553, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph) Nov 26 02:46:13 
localhost systemd[1]: Started libpod-conmon-23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6.scope. Nov 26 02:46:13 localhost systemd[1]: Started libcrun container. Nov 26 02:46:13 localhost podman[33547]: 2025-11-26 07:46:13.505655811 +0000 UTC m=+0.031088132 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:13 localhost podman[33547]: 2025-11-26 07:46:13.618162263 +0000 UTC m=+0.143594564 container init 23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_franklin, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, distribution-scope=public, build-date=2025-09-24T08:57:55, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 02:46:13 localhost podman[33547]: 2025-11-26 07:46:13.629449099 +0000 UTC m=+0.154881420 container start 23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_franklin, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, 
description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph) Nov 26 02:46:13 localhost podman[33547]: 2025-11-26 07:46:13.630001993 +0000 UTC m=+0.155434284 container attach 23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_franklin, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, release=553, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 02:46:13 localhost affectionate_franklin[33563]: 167 167 Nov 26 02:46:13 localhost systemd[1]: libpod-23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6.scope: Deactivated successfully. Nov 26 02:46:13 localhost podman[33547]: 2025-11-26 07:46:13.634982835 +0000 UTC m=+0.160415176 container died 23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_franklin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, version=7, vendor=Red Hat, Inc., ceph=True, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main) Nov 26 02:46:13 localhost ceph-osd[31674]: osd.0 pg_epoch: 15 pg[1.0( v 14'5 (0'0,14'5] local-lis/les=13/14 n=2 ec=13/13 lis/c=13/0 les/c/f=14/0/0 sis=15 pruub=15.769296646s) [2,0,3] r=1 lpr=15 pi=[13,15)/1 luod=0'0 lua=0'0 crt=14'5 lcod 14'4 mlcod 0'0 active pruub 24.129804611s@ mbc={}] start_peering_interval up [2,0] 
-> [2,0,3], acting [2,0] -> [2,0,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 02:46:13 localhost ceph-osd[31674]: osd.0 pg_epoch: 15 pg[1.0( v 14'5 (0'0,14'5] local-lis/les=13/14 n=2 ec=13/13 lis/c=13/0 les/c/f=14/0/0 sis=15 pruub=15.769179344s) [2,0,3] r=1 lpr=15 pi=[13,15)/1 crt=14'5 lcod 14'4 mlcod 0'0 unknown NOTIFY pruub 24.129804611s@ mbc={}] state: transitioning to Stray Nov 26 02:46:13 localhost systemd[1]: var-lib-containers-storage-overlay-6d4d665884b6ecde60b817f1312414a74fa3c5e1faa90e7d377e73bdb2dac310-merged.mount: Deactivated successfully. Nov 26 02:46:13 localhost podman[33569]: 2025-11-26 07:46:13.754543171 +0000 UTC m=+0.107992644 container remove 23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_franklin, ceph=True, version=7, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, release=553, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7) Nov 26 02:46:13 localhost systemd[1]: 
libpod-conmon-23c31a8f753791b820366bfde7f6e10477566315fd3285bd7d8d4c11572774a6.scope: Deactivated successfully. Nov 26 02:46:13 localhost podman[33590]: Nov 26 02:46:13 localhost podman[33590]: 2025-11-26 07:46:13.930614259 +0000 UTC m=+0.060229025 container create e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_keller, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, ceph=True, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.33.12) Nov 26 02:46:13 localhost systemd[1]: Started libpod-conmon-e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936.scope. Nov 26 02:46:13 localhost systemd[1]: Started libcrun container. 
Nov 26 02:46:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cfec0ebc71903e3af53d03791dd2645a90d91f986f60293256b593e205c01163/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:14 localhost podman[33590]: 2025-11-26 07:46:13.907769159 +0000 UTC m=+0.037383995 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 02:46:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cfec0ebc71903e3af53d03791dd2645a90d91f986f60293256b593e205c01163/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cfec0ebc71903e3af53d03791dd2645a90d91f986f60293256b593e205c01163/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 02:46:14 localhost podman[33590]: 2025-11-26 07:46:14.032322377 +0000 UTC m=+0.161937183 container init e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_keller, CEPH_POINT_RELEASE=, io.openshift.expose-services=, version=7, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, 
com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.buildah.version=1.33.12, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 02:46:14 localhost podman[33590]: 2025-11-26 07:46:14.046523455 +0000 UTC m=+0.176138251 container start e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_keller, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, release=553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True) Nov 26 02:46:14 localhost podman[33590]: 2025-11-26 07:46:14.046831992 +0000 UTC m=+0.176446788 container attach e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_keller, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, ceph=True, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12) Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 31.588 iops: 8086.432 elapsed_sec: 0.371 Nov 26 02:46:14 localhost ceph-osd[32631]: log_channel(cluster) log [WRN] : OSD bench result of 8086.431511 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.4. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. 
Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 0 waiting for initial osdmap Nov 26 02:46:14 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4[32627]: 2025-11-26T07:46:14.320+0000 7f43a86bd640 -1 osd.4 0 waiting for initial osdmap Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 15 crush map has features 288514051259236352, adjusting msgr requires for clients Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 15 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 15 crush map has features 3314933000852226048, adjusting msgr requires for osds Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 15 check_osdmap_features require_osd_release unknown -> reef Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 15 set_numa_affinity not setting numa affinity Nov 26 02:46:14 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-osd-4[32627]: 2025-11-26T07:46:14.342+0000 7f43a34d2640 -1 osd.4 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 15 _collect_metadata loop4: no unique device id for loop4: fallback method has no model nor serial Nov 26 02:46:14 localhost ceph-osd[32631]: osd.4 16 state: booting -> active Nov 26 02:46:14 localhost kind_keller[33605]: [ Nov 26 02:46:14 localhost kind_keller[33605]: { Nov 26 02:46:14 localhost kind_keller[33605]: "available": false, Nov 26 02:46:14 localhost kind_keller[33605]: "ceph_device": false, Nov 26 02:46:14 localhost kind_keller[33605]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 26 02:46:14 localhost kind_keller[33605]: "lsm_data": {}, Nov 26 02:46:14 localhost kind_keller[33605]: "lvs": [], Nov 26 02:46:14 localhost kind_keller[33605]: "path": "/dev/sr0", Nov 26 02:46:14 localhost 
kind_keller[33605]: "rejected_reasons": [ Nov 26 02:46:14 localhost kind_keller[33605]: "Insufficient space (<5GB)", Nov 26 02:46:14 localhost kind_keller[33605]: "Has a FileSystem" Nov 26 02:46:14 localhost kind_keller[33605]: ], Nov 26 02:46:14 localhost kind_keller[33605]: "sys_api": { Nov 26 02:46:14 localhost kind_keller[33605]: "actuators": null, Nov 26 02:46:14 localhost kind_keller[33605]: "device_nodes": "sr0", Nov 26 02:46:14 localhost kind_keller[33605]: "human_readable_size": "482.00 KB", Nov 26 02:46:14 localhost kind_keller[33605]: "id_bus": "ata", Nov 26 02:46:14 localhost kind_keller[33605]: "model": "QEMU DVD-ROM", Nov 26 02:46:14 localhost kind_keller[33605]: "nr_requests": "2", Nov 26 02:46:14 localhost kind_keller[33605]: "partitions": {}, Nov 26 02:46:14 localhost kind_keller[33605]: "path": "/dev/sr0", Nov 26 02:46:14 localhost kind_keller[33605]: "removable": "1", Nov 26 02:46:14 localhost kind_keller[33605]: "rev": "2.5+", Nov 26 02:46:14 localhost kind_keller[33605]: "ro": "0", Nov 26 02:46:14 localhost kind_keller[33605]: "rotational": "1", Nov 26 02:46:14 localhost kind_keller[33605]: "sas_address": "", Nov 26 02:46:14 localhost kind_keller[33605]: "sas_device_handle": "", Nov 26 02:46:14 localhost kind_keller[33605]: "scheduler_mode": "mq-deadline", Nov 26 02:46:14 localhost kind_keller[33605]: "sectors": 0, Nov 26 02:46:14 localhost kind_keller[33605]: "sectorsize": "2048", Nov 26 02:46:14 localhost kind_keller[33605]: "size": 493568.0, Nov 26 02:46:14 localhost kind_keller[33605]: "support_discard": "0", Nov 26 02:46:14 localhost kind_keller[33605]: "type": "disk", Nov 26 02:46:14 localhost kind_keller[33605]: "vendor": "QEMU" Nov 26 02:46:14 localhost kind_keller[33605]: } Nov 26 02:46:14 localhost kind_keller[33605]: } Nov 26 02:46:14 localhost kind_keller[33605]: ] Nov 26 02:46:14 localhost systemd[1]: libpod-e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936.scope: Deactivated successfully. 
Nov 26 02:46:15 localhost podman[35080]: 2025-11-26 07:46:15.002016644 +0000 UTC m=+0.062424798 container died e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_keller, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, release=553, io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True)
Nov 26 02:46:15 localhost systemd[1]: var-lib-containers-storage-overlay-cfec0ebc71903e3af53d03791dd2645a90d91f986f60293256b593e205c01163-merged.mount: Deactivated successfully.
Nov 26 02:46:15 localhost podman[35080]: 2025-11-26 07:46:15.037913813 +0000 UTC m=+0.098321927 container remove e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_keller, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, release=553, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 26 02:46:15 localhost systemd[1]: libpod-conmon-e60d185a20979700fe9a9672941da0757847d451506ff8109c7034a3d5fa5936.scope: Deactivated successfully.
Nov 26 02:46:24 localhost systemd[26105]: Starting Mark boot as successful...
Nov 26 02:46:24 localhost podman[35208]: 2025-11-26 07:46:24.046575712 +0000 UTC m=+0.084536270 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=553, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True)
Nov 26 02:46:24 localhost systemd[26105]: Finished Mark boot as successful.
Nov 26 02:46:24 localhost podman[35208]: 2025-11-26 07:46:24.160174242 +0000 UTC m=+0.198134830 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Nov 26 02:46:27 localhost sshd[35289]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:46:55 localhost sshd[35291]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:47:25 localhost podman[35394]: 2025-11-26 07:47:25.980351729 +0000 UTC m=+0.090052722 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 26 02:47:26 localhost podman[35394]: 2025-11-26 07:47:26.107914793 +0000 UTC m=+0.217615786 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., release=553)
Nov 26 02:47:28 localhost sshd[35534]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:47:40 localhost systemd[1]: session-14.scope: Deactivated successfully.
Nov 26 02:47:40 localhost systemd[1]: session-14.scope: Consumed 22.207s CPU time.
Nov 26 02:47:40 localhost systemd-logind[761]: Session 14 logged out. Waiting for processes to exit.
Nov 26 02:47:40 localhost systemd-logind[761]: Removed session 14.
Nov 26 02:48:32 localhost sshd[35614]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:48:32 localhost sshd[35615]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:49:35 localhost sshd[35693]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:49:35 localhost sshd[35694]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:49:37 localhost systemd[26105]: Created slice User Background Tasks Slice.
Nov 26 02:49:37 localhost systemd[26105]: Starting Cleanup of User's Temporary Files and Directories...
Nov 26 02:49:37 localhost systemd[26105]: Finished Cleanup of User's Temporary Files and Directories.
Nov 26 02:50:26 localhost sshd[35697]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:50:26 localhost sshd[35698]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:50:56 localhost sshd[35778]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:50:57 localhost systemd-logind[761]: New session 28 of user zuul.
Nov 26 02:50:57 localhost systemd[1]: Started Session 28 of User zuul.
Nov 26 02:50:57 localhost python3[35826]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 26 02:50:58 localhost python3[35871]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 02:50:58 localhost python3[35891]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005536118.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 26 02:50:59 localhost python3[35947]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:50:59 localhost python3[35990]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764143459.114804-66802-213941191943490/source _original_basename=tmp9ad3ifid follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:00 localhost python3[36020]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:00 localhost python3[36036]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:00 localhost python3[36052]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:01 localhost python3[36068]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkJddqZ+TwuLMCoD/CUKb6dnZ5nImZkr99k28vGFQTZD8B2L/Jx+KwKLctJwJdAbPZC/wCl/36ZPjbla3kwCBCcgDq4oMWypJH1O/63E9BgGHNHKyv8+W8cLdCN1zy1EpGO62uGHVn4l57+Bp2T37Fy3IKVmX+tQkDoTdmzgtr5i8E1khji5awitbNX6RCXkWRlMkvVByLh74T7HTnO21e4xp556VlHAFGjYIDNAjgNkyhO6M9ssBagiIOrBzbXvnmNyZxIeiznzLQGBwty3La7OiGgztNcwLCRTVHG+4hwiKk7RIRradK18HqKab9McNcGbbIU/uUQYbYTPIEWiEmDTYeyTBoy+veLsVUYfXRLJDerz6WvmIUiiLVU0ABmx7b9k9dwjYa9U8tscYuTfYVjocSnR3IVQDEikuw4Bklms2ijHLwfRS9oeb9XvpqyM10A4FQnSLPgHdrRpCWBm4+Nek0Esi3RXYub8PT5HuL5Q87j+qe66WazVu6iSRRGCM= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:02 localhost python3[36082]: ansible-ping Invoked with data=pong
Nov 26 02:51:12 localhost sshd[36084]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:51:13 localhost sshd[36086]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:51:13 localhost systemd-logind[761]: New session 29 of user tripleo-admin.
Nov 26 02:51:13 localhost systemd[1]: Created slice User Slice of UID 1003.
Nov 26 02:51:13 localhost systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 26 02:51:13 localhost systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 26 02:51:13 localhost systemd[1]: Starting User Manager for UID 1003...
Nov 26 02:51:13 localhost systemd[36090]: Queued start job for default target Main User Target.
Nov 26 02:51:13 localhost systemd[36090]: Created slice User Application Slice.
Nov 26 02:51:13 localhost systemd[36090]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 26 02:51:13 localhost systemd[36090]: Started Daily Cleanup of User's Temporary Directories.
Nov 26 02:51:13 localhost systemd[36090]: Reached target Paths.
Nov 26 02:51:13 localhost systemd[36090]: Reached target Timers.
Nov 26 02:51:13 localhost systemd[36090]: Starting D-Bus User Message Bus Socket...
Nov 26 02:51:13 localhost systemd[36090]: Starting Create User's Volatile Files and Directories...
Nov 26 02:51:13 localhost systemd[36090]: Listening on D-Bus User Message Bus Socket.
Nov 26 02:51:13 localhost systemd[36090]: Finished Create User's Volatile Files and Directories.
Nov 26 02:51:13 localhost systemd[36090]: Reached target Sockets.
Nov 26 02:51:13 localhost systemd[36090]: Reached target Basic System.
Nov 26 02:51:13 localhost systemd[36090]: Reached target Main User Target.
Nov 26 02:51:13 localhost systemd[36090]: Startup finished in 121ms.
Nov 26 02:51:13 localhost systemd[1]: Started User Manager for UID 1003.
Nov 26 02:51:13 localhost systemd[1]: Started Session 29 of User tripleo-admin.
Nov 26 02:51:14 localhost python3[36151]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 02:51:19 localhost python3[36171]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Nov 26 02:51:20 localhost python3[36187]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 26 02:51:20 localhost python3[36235]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.0uv4uqzqtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:21 localhost python3[36265]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.0uv4uqzqtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:22 localhost python3[36281]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.0uv4uqzqtmphosts insertbefore=BOF block=172.17.0.106 np0005536117.localdomain np0005536117#012172.18.0.106 np0005536117.storage.localdomain np0005536117.storage#012172.20.0.106 np0005536117.storagemgmt.localdomain np0005536117.storagemgmt#012172.17.0.106 np0005536117.internalapi.localdomain np0005536117.internalapi#012172.19.0.106 np0005536117.tenant.localdomain np0005536117.tenant#012192.168.122.106 np0005536117.ctlplane.localdomain np0005536117.ctlplane#012172.17.0.107 np0005536118.localdomain np0005536118#012172.18.0.107 np0005536118.storage.localdomain np0005536118.storage#012172.20.0.107 np0005536118.storagemgmt.localdomain np0005536118.storagemgmt#012172.17.0.107 np0005536118.internalapi.localdomain np0005536118.internalapi#012172.19.0.107 np0005536118.tenant.localdomain np0005536118.tenant#012192.168.122.107 np0005536118.ctlplane.localdomain np0005536118.ctlplane#012172.17.0.108 np0005536119.localdomain np0005536119#012172.18.0.108 np0005536119.storage.localdomain np0005536119.storage#012172.20.0.108 np0005536119.storagemgmt.localdomain np0005536119.storagemgmt#012172.17.0.108 np0005536119.internalapi.localdomain np0005536119.internalapi#012172.19.0.108 np0005536119.tenant.localdomain np0005536119.tenant#012192.168.122.108 np0005536119.ctlplane.localdomain np0005536119.ctlplane#012172.17.0.103 np0005536112.localdomain np0005536112#012172.18.0.103 np0005536112.storage.localdomain np0005536112.storage#012172.20.0.103 np0005536112.storagemgmt.localdomain np0005536112.storagemgmt#012172.17.0.103 np0005536112.internalapi.localdomain np0005536112.internalapi#012172.19.0.103 np0005536112.tenant.localdomain np0005536112.tenant#012192.168.122.103 np0005536112.ctlplane.localdomain np0005536112.ctlplane#012172.17.0.104 np0005536113.localdomain np0005536113#012172.18.0.104 np0005536113.storage.localdomain np0005536113.storage#012172.20.0.104 np0005536113.storagemgmt.localdomain np0005536113.storagemgmt#012172.17.0.104 np0005536113.internalapi.localdomain np0005536113.internalapi#012172.19.0.104 np0005536113.tenant.localdomain np0005536113.tenant#012192.168.122.104 np0005536113.ctlplane.localdomain np0005536113.ctlplane#012172.17.0.105 np0005536114.localdomain np0005536114#012172.18.0.105 np0005536114.storage.localdomain np0005536114.storage#012172.20.0.105 np0005536114.storagemgmt.localdomain np0005536114.storagemgmt#012172.17.0.105 np0005536114.internalapi.localdomain np0005536114.internalapi#012172.19.0.105 np0005536114.tenant.localdomain np0005536114.tenant#012192.168.122.105 np0005536114.ctlplane.localdomain np0005536114.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.localdomain#012172.18.0.127 overcloud.storage.localdomain#012172.20.0.139 overcloud.storagemgmt.localdomain#012172.17.0.177 overcloud.internalapi.localdomain#012172.21.0.163 overcloud.localdomain#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:22 localhost python3[36297]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.0uv4uqzqtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:51:22 localhost python3[36314]: ansible-file Invoked with path=/tmp/ansible.0uv4uqzqtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:51:23 localhost python3[36330]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:51:24 localhost python3[36347]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:51:29 localhost python3[36366]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:51:29 localhost python3[36383]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:51:56 localhost sshd[36898]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:52:39 localhost kernel: SELinux: Converting 2698 SID table entries...
Nov 26 02:52:39 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 26 02:52:39 localhost kernel: SELinux: policy capability open_perms=1
Nov 26 02:52:39 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 26 02:52:39 localhost kernel: SELinux: policy capability always_check_network=0
Nov 26 02:52:39 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 26 02:52:39 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 26 02:52:39 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 26 02:52:39 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=6 res=1
Nov 26 02:52:39 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:52:39 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 02:52:39 localhost systemd[1]: Reloading.
Nov 26 02:52:39 localhost systemd-rc-local-generator[37262]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:52:39 localhost systemd-sysv-generator[37265]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:52:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:52:39 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 02:52:40 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 26 02:52:40 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 26 02:52:40 localhost systemd[1]: run-r6367dfc81395420fae7f36741f4b52cf.service: Deactivated successfully.
Nov 26 02:52:40 localhost sshd[37707]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:52:42 localhost python3[37724]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:52:44 localhost python3[37863]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:52:44 localhost systemd[1]: Reloading.
Nov 26 02:52:44 localhost systemd-sysv-generator[37896]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:52:44 localhost systemd-rc-local-generator[37890]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:52:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:52:45 localhost python3[37917]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:52:46 localhost python3[37933]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:52:47 localhost python3[37950]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 26 02:52:48 localhost python3[37968]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:52:49 localhost python3[37986]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:52:50 localhost python3[38004]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 02:52:50 localhost systemd[1]: Reloading Network Manager...
Nov 26 02:52:50 localhost NetworkManager[5970]: [1764143570.4436] audit: op="reload" arg="0" pid=38007 uid=0 result="success"
Nov 26 02:52:50 localhost NetworkManager[5970]: [1764143570.4445] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Nov 26 02:52:50 localhost NetworkManager[5970]: [1764143570.4445] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Nov 26 02:52:50 localhost systemd[1]: Reloaded Network Manager.
Nov 26 02:52:50 localhost python3[38023]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:52:51 localhost python3[38040]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:52:51 localhost python3[38058]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:52:52 localhost python3[38074]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:52:52 localhost python3[38090]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 26 02:52:53 localhost python3[38106]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:52:54 localhost python3[38122]: ansible-blockinfile Invoked with path=/tmp/ansible.nhnknaty block=[192.168.122.106]*,[np0005536117.ctlplane.localdomain]*,[172.17.0.106]*,[np0005536117.internalapi.localdomain]*,[172.18.0.106]*,[np0005536117.storage.localdomain]*,[172.20.0.106]*,[np0005536117.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005536117.tenant.localdomain]*,[np0005536117.localdomain]*,[np0005536117]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvuiblKTsbmL2sEkAoI8N4XKL02+ZQhAWAKn7UrsKCBB+OGZN1PDIVxXaBQLqG39LX15HM+BCqqS/mo8n1AnvNg89PsWVtq1GxHkisfP/wlNkstEbw85Ezi/3gIjGsbvZ0Bnyh1Zi9GQny2ekd5fOPUP7VVcn5pOrgWNAsC4JWU2vNSsGyg9B+aqV/qfQxTX1REK7lVvKdzqw4RCTHje8SJXUTTSDnD81wreB6Vl3QWWAdVVHsW2UUA74nGGY3XyNUZZuGuCHYKAUElXQdguhdjsq986AS/I81Km6Ak6I9FajfVDmk/iJl/G/Kg7bcas4rbNMBNmR22pcxHOrRUR6RMBYnYdfFTmatTMJKzZr9SFHbYT8gIkC9S4Xi5esJmySHtFBlK0u2MMmOaQAyqGL7xZEOxPrYvaTwrn4QPJYOlGDJKH8HuruPKL6h1HrLKK5no0WhlfnufFo2rdMbbvNbGy+5PgwqJCpqX8DltF/Og/pa+33AXc1JlJZ8zPx1LYM=#012[192.168.122.107]*,[np0005536118.ctlplane.localdomain]*,[172.17.0.107]*,[np0005536118.internalapi.localdomain]*,[172.18.0.107]*,[np0005536118.storage.localdomain]*,[172.20.0.107]*,[np0005536118.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005536118.tenant.localdomain]*,[np0005536118.localdomain]*,[np0005536118]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCfnfafkBxgNm6Kh5k6DvljneX0c5GykKaP2XXhwemEW1/qBm+4yfqmo2y4C08a3GNzTwfPv6B8iwTNKUs6SWEt24JWI/lAVh/ocwBt6VE7KXoscG0Ha7PCZsrtdondI3wHXdYnelEySlDWfgVDx7J/STk7rxKcUS+V5otrTN85yB2OHKKmJ492Y00oCiaudBb9eef6hCAL/JcF2/VXXOpMs1zH/15Mb8bYruhzP/5xB7CCeKLXivfRH7Wn37Ds8UzxSdZUUK5y7TD0QGXGoTnLf3XGG5pBLXvHjG5Mh+Owvt+B5RLXIbXX03+hVoOi8ZMOZNkSUJl/z82BYCARUxbkbrANQqxf9138BGvkERGRfDEWqQUW1dWYEWc8PGLx6fIdrWDBHglnS2RYtXjK/rQktahcaZD7FXQOnaXQv8mfQ0q8kAqmrFA+gV93Ss3keS7YBd8ZASQpIHPZQFRxkztDV638fMq/eiuc59AZGmrzDe3PFtDXroz1uJKJAJdLNhc=#012[192.168.122.108]*,[np0005536119.ctlplane.localdomain]*,[172.17.0.108]*,[np0005536119.internalapi.localdomain]*,[172.18.0.108]*,[np0005536119.storage.localdomain]*,[172.20.0.108]*,[np0005536119.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005536119.tenant.localdomain]*,[np0005536119.localdomain]*,[np0005536119]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyUtJmwR3ObwtG76w1yXvomHeazZ9D9Kciy4thYCqfEBxb3vhVsB3GMYI6r6bv/mIdyY4palMp4Sr9+W36ruD0JQWfBhLPCSN+hps78UbvG6JXVY/AXW8Cweb5cxgq+IgYeMfhHkpciM4wq8I7uZ0kcMw9+pL76alR0DkvQW/eedRwdkx1/b4pXDds4YPlbSAHas3nVgc/RrfGIQJ1tDnFyRK50M85UHs9j59jGMB/Bho4zv+gEU5EzIQuUPaCY0sdRohlIWCqIynw0PycXoJ7eeCuhrCd3U9FD1XuCVKtOPfX2U3altG+lcVUpyjgP+3dYffy/mDzC6vTljrAuxXHtoePKGJWvB6OS25CSmjSngLlV3ZbXFIDCi2RQpKrjVknksY86vz/sch6ul+qHi/m7r2zptSixRHV9c+BDd1EjAllDmCHp3R9E5dF9Re5IZtna4qBBxgKFmbLgYpuvoNtRNlzQ5ZshhVC01OfQBmSGOqXmr9+KJIYMTYePz1aGUs=#012[192.168.122.103]*,[np0005536112.ctlplane.localdomain]*,[172.17.0.103]*,[np0005536112.internalapi.localdomain]*,[172.18.0.103]*,[np0005536112.storage.localdomain]*,[172.20.0.103]*,[np0005536112.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005536112.tenant.localdomain]*,[np0005536112.localdomain]*,[np0005536112]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCnw/aSCc3H/eikMKIv8MupYWNOR8z7XR+3smBj2uRUTCxpYQMrYPQJS0ym3zpBW0k4x6uKgLc5QfAOIUixQFqmWslELY62n9dp/YdIcnizhKWLQDam94X1ghhDgUbGQ+fdkkvUXWdf7fjjf1xQpHhiL9GDojGAucM1miq+IMHyr2MOJUch+9AXTKwY6Uj8bQi8zuipqxiZHqJhJTAqihg84NSz+4j2x2Ne5dj4Q51PgO2g4TbRhlGB6fKfAB4bRJoPCJ5B9CVBMQaMoWOjwTQ2IYOyF1S2NmYp1Q5+48gmnmW+/Q2RVpvV3nO+JamlCt62HbfuI7eVY6iA3yGJFdUMtlvEZnWj33b1ZflItoDXioyNIfjDo5apKjs3c2W7bnYUpj4Ibdm2IG6nnZJUwRmiAK+UJyvppntz7sD/sj9mlcMJ2Is+lKZKk6x+xMap1clUet53JUhbYz48+AlIKsLq42H0Q1bNkwHVjHe9G8J0Oyey6yoGZ/3Ct7WfChtICek=#012[192.168.122.104]*,[np0005536113.ctlplane.localdomain]*,[172.17.0.104]*,[np0005536113.internalapi.localdomain]*,[172.18.0.104]*,[np0005536113.storage.localdomain]*,[172.20.0.104]*,[np0005536113.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005536113.tenant.localdomain]*,[np0005536113.localdomain]*,[np0005536113]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCcNCOFbU4kXLoESv/2g1Ngr/xjK4i+uRjHpmzX1pOkz0pFxT3jscjN9VufPlOVwhkzCZZmudRNYn+Yv6BKrU+arWLx5NIIkErYWE6+lYRTKPt81XDZ9pZwtdR59NiimZURgJntJ/Ru0lPTJGpMJ3x65MyMyQV4kOEjdCiwhnLSp4XlQMBsmOB7tpglxHiPSkFMvwWTg8pbWwMA+td249DJF7U5eM+rvSCXS4maLqVmbYXuy6O+rNjaPgpAiLoJUn4HclHA3QPPT0KFE4aamiRb6ge0mG/XEMj3yM402Amdu7Rf4uR5Y+25j0VGSlmEOLKsRLOpV2xNfgJthx7xnfOoPlzRsl5sp0VxC9k9FeKtdJH8vtnrRkBeCZpME3/DWwH7ZylwHFC2Ew4ws53R/R+hp27zxNJ0isqnkvAViw9HjfC+ChQv9H0Z52p8plqo14Nc0JCESOolX2/apTOTKbd6+Cfxv5QxsZWdNeteKQfFK4mUcGrhujE4vqv/U11ThQE=#012[192.168.122.105]*,[np0005536114.ctlplane.localdomain]*,[172.17.0.105]*,[np0005536114.internalapi.localdomain]*,[172.18.0.105]*,[np0005536114.storage.localdomain]*,[172.20.0.105]*,[np0005536114.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005536114.tenant.localdomain]*,[np0005536114.localdomain]*,[np0005536114]* ssh-rsa
AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsepcg+BueYLiPgRHnP9Izs6ROoJIH+OgayDdq1vuZHUwaHTqGCuLqGGJHUB7pN6LVaaMeaxMqz615UuHzL1S8q0VpdrxlRYDvwaY/OI3okxeGCpmkUWORcZlxfhYklmXCUTnEfVisKc379eTDBcFWqgA/GKCRJ+KzzNuunc4S5HjuSGdXSMFSlNOhdX0yW1dGsGVIG7Yihr76o1WhifRGz4KEAQ9F3Kq3YTcbLLcsqlU9r6qHaAj3M19ulSoUaH8GvfUnQa9FX24xH5pbSFBL9P5onW2xZZf4Dl/K5sE4PETonYgeqPONH09NPE8qBduLsJKVGl3wXMkMNbOcxOM1TEuOld3F4kkmFj5txTfV+vftRpWL5fcP83bMRw4r1lm6XmiQi+5KAKizplNvKE74oOKiash0ylGy8qK4yFMkTVu5F8ulzb9EDeKyzLTBz6HFeosGXyEMG15QXT68QHoYUEPVl/WiDxcqy2qed8dCCZd0b7xV4tb3jJ45MUVI/M=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:52:54 localhost python3[38138]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.nhnknaty' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:52:54 localhost python3[38156]: ansible-file Invoked with path=/tmp/ansible.nhnknaty state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:52:55 localhost python3[38172]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:52:56 localhost python3[38188]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:52:56 localhost python3[38206]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:52:57 localhost python3[38225]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Nov 26 02:52:59 localhost python3[38362]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:00 localhost python3[38379]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:53:03 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 26 02:53:03 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 26 02:53:03 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:53:03 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 02:53:03 localhost systemd[1]: Reloading.
Nov 26 02:53:03 localhost systemd-sysv-generator[38455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:53:03 localhost systemd-rc-local-generator[38452]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:53:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:53:04 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 02:53:04 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 26 02:53:04 localhost systemd[1]: tuned.service: Deactivated successfully.
Nov 26 02:53:04 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 26 02:53:04 localhost systemd[1]: tuned.service: Consumed 1.758s CPU time.
Nov 26 02:53:04 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 26 02:53:04 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 26 02:53:04 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 26 02:53:04 localhost systemd[1]: run-r7763b8c98a8e4f8885126cabe0e20c8f.service: Deactivated successfully.
Nov 26 02:53:05 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Nov 26 02:53:05 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:53:05 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 02:53:05 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 26 02:53:05 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 26 02:53:05 localhost systemd[1]: run-rdf07df9e468c4e92800e3554a239f8fa.service: Deactivated successfully.
Nov 26 02:53:06 localhost python3[38817]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:53:06 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 26 02:53:07 localhost systemd[1]: tuned.service: Deactivated successfully.
Nov 26 02:53:07 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 26 02:53:07 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 26 02:53:08 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Nov 26 02:53:08 localhost python3[39012]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:09 localhost python3[39029]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 26 02:53:09 localhost python3[39045]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:53:10 localhost python3[39061]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:12 localhost python3[39081]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:12 localhost python3[39098]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:53:15 localhost python3[39114]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:19 localhost python3[39130]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:20 localhost python3[39178]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:20 localhost python3[39223]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143600.0999057-71475-238565352721299/source _original_basename=tmp1mag8x17 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:21 localhost python3[39253]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:21 localhost python3[39301]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:22 localhost python3[39344]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143601.6316364-71568-206825177184121/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=c5ddefb043c2f26136cfdc1caee35f1056524cb0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:22 localhost python3[39406]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:23 localhost python3[39449]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143602.5146809-71625-39989502163088/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=41a49edc245247058c919c995eb124d7af8393b7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:23 localhost python3[39511]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:23 localhost sshd[39536]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:53:24 localhost python3[39556]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143603.3807998-71625-173973434873911/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=066fa106166fda87f816d0263a357d8ae2b8f10a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:24 localhost python3[39618]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:25 localhost python3[39661]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143604.2622015-71625-240389562264110/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=175c760950d63a47f443f25b58088dba962f090b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:25 localhost python3[39723]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:25 localhost python3[39766]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143605.2455232-71625-156768456716782/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:26 localhost python3[39828]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:26 localhost python3[39871]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143606.094303-71625-53643857225361/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=0b200b2c71296040febc9513867f4aded8aa36e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:27 localhost python3[39933]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:27 localhost python3[39976]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143606.9890668-71625-64997738191735/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:28 localhost python3[40038]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:28 localhost python3[40081]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143607.8313465-71625-121216403615412/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=b20821b14e21b54f45f13ce8f7e67a77730dc68b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:29 localhost python3[40143]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:29 localhost python3[40186]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143608.6819417-71625-136688206865064/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:29 localhost python3[40248]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:30 localhost python3[40291]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143609.510914-71625-265128348852638/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:30 localhost python3[40353]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:31 localhost python3[40396]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143610.4152117-71625-5857472165952/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=9b4fde34c31dca4e2edd4edbe5807674fe821b53 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:31 localhost python3[40426]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:53:32 localhost python3[40474]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:53:32 localhost python3[40517]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143612.2434323-72289-154408559437183/source _original_basename=tmpb8ma_n2w follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:53:37 localhost python3[40624]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 02:53:37 localhost systemd[36090]: Starting Mark boot as successful...
Nov 26 02:53:37 localhost systemd[36090]: Finished Mark boot as successful.
Nov 26 02:53:37 localhost python3[40685]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:42 localhost python3[40703]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:47 localhost python3[40720]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:47 localhost python3[40743]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:52 localhost python3[40760]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:52 localhost python3[40783]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:53:57 localhost python3[40800]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:01 localhost python3[40817]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:02 localhost python3[40840]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:06 localhost sshd[40858]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:54:06 localhost python3[40857]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:11 localhost python3[40876]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:11 localhost python3[40899]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:16 localhost python3[40916]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:20 localhost python3[40933]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:21 localhost python3[40956]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:25 localhost python3[40973]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:54:30 localhost python3[40990]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:31 localhost python3[41038]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:31 localhost python3[41056]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmp6g8dt3ho recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:32 localhost python3[41086]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:32 localhost python3[41134]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:33 localhost python3[41152]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:33 localhost python3[41214]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:33 localhost python3[41232]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:34 localhost python3[41294]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:34 localhost python3[41312]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:35 localhost python3[41374]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:35 localhost python3[41392]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:36 localhost python3[41454]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:36 localhost python3[41472]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:36 localhost python3[41564]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:37 localhost python3[41601]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:37 localhost python3[41675]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:37 localhost python3[41693]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:38 localhost python3[41755]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:38 localhost python3[41773]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:39 localhost python3[41835]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:39 localhost python3[41853]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:54:39 localhost python3[41930]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:54:40 localhost python3[41948]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file
path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:54:40 localhost python3[42010]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:54:41 localhost python3[42028]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:54:41 localhost python3[42058]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 02:54:42 localhost python3[42106]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:54:42 localhost python3[42124]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpqk5dov73 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:54:45 localhost python3[42154]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 26 02:54:50 localhost python3[42171]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 02:54:50 localhost sshd[42190]: main: sshd: ssh-rsa algorithm is disabled Nov 26 02:54:50 localhost python3[42189]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 02:54:52 localhost python3[42209]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 02:54:52 localhost systemd[1]: Reloading. Nov 26 02:54:52 localhost systemd-rc-local-generator[42234]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:54:52 localhost systemd-sysv-generator[42238]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 02:54:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:54:52 localhost systemd[1]: Starting Netfilter Tables... Nov 26 02:54:52 localhost systemd[1]: Finished Netfilter Tables. Nov 26 02:54:53 localhost python3[42299]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:54:53 localhost python3[42342]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143693.0313385-75311-144759830745562/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:54:54 localhost python3[42372]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 02:54:55 localhost python3[42390]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 02:54:55 localhost python3[42439]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:54:56 localhost python3[42482]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143695.3390107-75452-194830544816899/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:54:56 localhost python3[42544]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:54:57 localhost python3[42587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143696.3509276-75515-129504710330896/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:54:57 localhost python3[42649]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:54:57 localhost python3[42692]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143697.3473644-75570-44636985000267/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None 
Nov 26 02:54:58 localhost python3[42754]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:54:58 localhost python3[42797]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143698.178817-75652-33955263221295/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:54:59 localhost python3[42859]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:55:00 localhost python3[42902]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143699.0918806-75719-91641991975175/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:55:00 localhost python3[42932]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 02:55:01 localhost 
python3[42997]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include "/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:55:01 localhost python3[43014]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 02:55:02 localhost python3[43031]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 02:55:02 localhost python3[43050]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:02 localhost python3[43066]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:03 localhost python3[43082]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:03 localhost python3[43098]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Nov 26 02:55:04 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=7 res=1 Nov 26 02:55:04 localhost python3[43118]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Nov 26 02:55:05 localhost kernel: SELinux: Converting 2702 SID table entries... Nov 26 02:55:05 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 02:55:05 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 02:55:05 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 02:55:05 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 02:55:05 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 02:55:05 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 02:55:05 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 02:55:05 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=8 res=1 Nov 26 02:55:06 localhost python3[43139]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? 
ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Nov 26 02:55:06 localhost kernel: SELinux: Converting 2702 SID table entries... Nov 26 02:55:06 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 02:55:06 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 02:55:06 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 02:55:06 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 02:55:06 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 02:55:06 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 02:55:06 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 02:55:07 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=9 res=1 Nov 26 02:55:07 localhost python3[43161]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Nov 26 02:55:08 localhost kernel: SELinux: Converting 2702 SID table entries... 
Nov 26 02:55:08 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 02:55:08 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 02:55:08 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 02:55:08 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 02:55:08 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 02:55:08 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 02:55:08 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 02:55:08 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=10 res=1 Nov 26 02:55:08 localhost python3[43186]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:09 localhost python3[43202]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:09 localhost python3[43218]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:09 localhost 
python3[43234]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 02:55:10 localhost python3[43250]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 02:55:10 localhost python3[43267]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 26 02:55:15 localhost python3[43284]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:15 localhost python3[43332]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:55:15 localhost python3[43375]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143715.2607994-76658-126384896553111/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False 
_original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:16 localhost python3[43405]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 02:55:16 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 26 02:55:16 localhost systemd[1]: Stopped Load Kernel Modules. Nov 26 02:55:16 localhost systemd[1]: Stopping Load Kernel Modules... Nov 26 02:55:16 localhost systemd[1]: Starting Load Kernel Modules... Nov 26 02:55:16 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Nov 26 02:55:16 localhost systemd-modules-load[43408]: Inserted module 'br_netfilter' Nov 26 02:55:16 localhost kernel: Bridge firewalling registered Nov 26 02:55:16 localhost systemd-modules-load[43408]: Module 'msr' is built in Nov 26 02:55:16 localhost systemd[1]: Finished Load Kernel Modules. 
Nov 26 02:55:17 localhost python3[43459]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:55:17 localhost python3[43502]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143716.7311835-76739-254435770993954/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 02:55:17 localhost python3[43532]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:18 localhost python3[43549]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:18 localhost python3[43567]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:18 localhost python3[43585]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:19 localhost python3[43602]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:19 localhost python3[43619]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True 
state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:19 localhost python3[43636]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:20 localhost python3[43654]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:20 localhost python3[43672]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:20 localhost python3[43690]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:21 localhost python3[43708]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:21 localhost python3[43726]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:21 localhost python3[43744]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:22 localhost python3[43762]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:22 localhost python3[43779]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present 
sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:22 localhost python3[43796]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:22 localhost python3[43813]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:23 localhost python3[43830]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 26 02:55:23 localhost python3[43848]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 02:55:23 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Nov 26 02:55:23 localhost systemd[1]: Stopped Apply Kernel Variables. Nov 26 02:55:23 localhost systemd[1]: Stopping Apply Kernel Variables... Nov 26 02:55:23 localhost systemd[1]: Starting Apply Kernel Variables... Nov 26 02:55:23 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Nov 26 02:55:23 localhost systemd[1]: Finished Apply Kernel Variables. 
Nov 26 02:55:24 localhost python3[43868]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:24 localhost python3[43884]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:24 localhost python3[43900]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:25 localhost python3[43916]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:55:25 localhost python3[43932]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:25 localhost python3[43948]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:26 localhost python3[43964]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:26 localhost python3[43980]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:26 localhost python3[43996]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:27 localhost python3[44044]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:55:27 localhost python3[44087]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143727.0158722-77026-246780394038491/source _original_basename=tmp4rswp95i follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:28 localhost python3[44117]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:55:29 localhost python3[44134]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:29 localhost python3[44182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:55:30 localhost python3[44225]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143729.5137398-77274-20206718071960/source _original_basename=tmpmmysddw_ follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:30 localhost python3[44255]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:31 localhost python3[44271]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:31 localhost python3[44287]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:31 localhost python3[44303]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:31 localhost python3[44319]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:32 localhost python3[44335]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:32 localhost python3[44351]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:32 localhost python3[44367]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:33 localhost python3[44383]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:33 localhost python3[44399]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Nov 26 02:55:34 localhost python3[44421]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005536118.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 26 02:55:34 localhost python3[44445]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Nov 26 02:55:34 localhost python3[44461]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:55:34 localhost sshd[44463]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:55:35 localhost python3[44512]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:55:35 localhost python3[44555]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143735.1278794-77515-203263432279436/source _original_basename=tmp51zthrvd follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:36 localhost python3[44585]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 26 02:55:36 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=11 res=1
Nov 26 02:55:37 localhost python3[44606]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:37 localhost python3[44622]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:37 localhost python3[44638]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Nov 26 02:55:39 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=12 res=1
Nov 26 02:55:39 localhost python3[44658]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:55:42 localhost python3[44802]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 02:55:43 localhost python3[44863]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:43 localhost python3[44879]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:55:44 localhost python3[44939]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:55:44 localhost python3[44982]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143743.7367463-77905-194207445909039/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=50b4d2e65d4655d73feb6f9108526c487ce9f5df backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:45 localhost python3[45044]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:55:45 localhost python3[45089]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143744.6798584-78178-280269836514073/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:45 localhost python3[45119]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:46 localhost python3[45135]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:46 localhost python3[45151]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:46 localhost python3[45167]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:47 localhost python3[45215]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:55:47 localhost python3[45258]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143747.0903594-78286-194462923356047/source _original_basename=tmpqke9gd7u follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:48 localhost python3[45288]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:48 localhost python3[45304]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:55:49 localhost python3[45320]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:55:52 localhost python3[45369]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:55:53 localhost python3[45414]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143752.4133458-78533-104728863584259/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:55:53 localhost python3[45445]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:55:53 localhost systemd[1]: Stopping OpenSSH server daemon...
Nov 26 02:55:53 localhost systemd[1]: sshd.service: Deactivated successfully.
Nov 26 02:55:53 localhost systemd[1]: Stopped OpenSSH server daemon.
Nov 26 02:55:53 localhost systemd[1]: sshd.service: Consumed 2.428s CPU time, read 1.9M from disk, written 0B to disk.
Nov 26 02:55:53 localhost systemd[1]: Stopped target sshd-keygen.target.
Nov 26 02:55:53 localhost systemd[1]: Stopping sshd-keygen.target...
Nov 26 02:55:53 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 02:55:53 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 02:55:53 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 02:55:53 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 26 02:55:53 localhost systemd[1]: Starting OpenSSH server daemon...
Nov 26 02:55:53 localhost sshd[45449]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:55:53 localhost systemd[1]: Started OpenSSH server daemon.
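The sshd_config copy above passes `validate=/usr/sbin/sshd -T -f %s`, so the rendered file must parse cleanly before it replaces the live config and the daemon is restarted. A minimal shell sketch of that guard (the staging path is a hypothetical example, not a path from the log):

```shell
# Install a candidate sshd_config only if sshd can parse it, then
# restart the daemon -- the same validate-then-replace pattern the
# ansible copy task uses. /tmp/sshd_config.candidate is hypothetical.
new_cfg=/tmp/sshd_config.candidate
if /usr/sbin/sshd -T -f "$new_cfg" >/dev/null 2>&1; then
    install -m 0600 "$new_cfg" /etc/ssh/sshd_config
    systemctl restart sshd
else
    echo "sshd_config validation failed; keeping the old config" >&2
fi
```

Because validation runs against the staged copy, a template error leaves the previous, working configuration in place instead of locking operators out of the node.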
Nov 26 02:55:54 localhost python3[45465]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:55:55 localhost python3[45483]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:55:55 localhost python3[45501]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:55:59 localhost python3[45550]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:55:59 localhost python3[45568]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:56:00 localhost python3[45598]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:56:00 localhost python3[45648]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:56:01 localhost python3[45666]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:56:01 localhost python3[45696]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 02:56:01 localhost systemd[1]: Reloading.
Nov 26 02:56:01 localhost systemd-rc-local-generator[45721]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:56:01 localhost systemd-sysv-generator[45726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:56:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:56:01 localhost systemd[1]: Starting chronyd online sources service...
Nov 26 02:56:01 localhost chronyc[45736]: 200 OK
Nov 26 02:56:01 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Nov 26 02:56:01 localhost systemd[1]: Finished chronyd online sources service.
Nov 26 02:56:02 localhost python3[45752]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:56:02 localhost chronyd[25901]: System clock was stepped by 0.000152 seconds
Nov 26 02:56:02 localhost python3[45769]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:56:02 localhost python3[45786]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:56:03 localhost chronyd[25901]: System clock was stepped by -0.000000 seconds
Nov 26 02:56:03 localhost python3[45803]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:56:03 localhost python3[45820]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 26 02:56:03 localhost systemd[1]: Starting Time & Date Service...
Nov 26 02:56:03 localhost systemd[1]: Started Time & Date Service.
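The repeated `chronyc makestep` / `chronyc waitsync 30` pair above is the standard scripted way to force an immediate clock step and then block until chrony reports synchronisation; the chronyd entries confirm the resulting (tiny) steps. A minimal sketch of the same sequence, guarded so it is a harmless no-op on hosts without chrony:

```shell
# Step the clock immediately if chrony considers it off, then poll
# (up to 30 tries) until it reports the clock is synchronised.
# Guarded so the sketch does nothing where chronyc is not installed.
if command -v chronyc >/dev/null 2>&1; then
    chronyc makestep
    chronyc waitsync 30
    timedatectl set-timezone UTC   # mirrors the ansible-timezone task
fi
```

Running `waitsync` after `makestep` matters: it turns "I asked for a step" into "the clock is actually in sync" before later time-sensitive deployment steps proceed.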
Nov 26 02:56:04 localhost python3[45840]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:56:05 localhost python3[45857]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:56:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 26 02:56:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3402 writes, 16K keys, 3402 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s#012Cumulative WAL: 3402 writes, 206 syncs, 16.51 writes per sync, written: 0.01 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3402 writes, 16K keys, 3402 commit groups, 1.0 writes per commit group, ingest: 15.29 MB, 0.03 MB/s#012Interval WAL: 3402 writes, 206 syncs, 16.51 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt
Nov 26 02:56:05 localhost python3[45874]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 26 02:56:06 localhost python3[45890]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:56:06 localhost python3[45906]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:56:07 localhost python3[45922]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 02:56:07 localhost python3[45970]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:56:08 localhost python3[46013]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143767.3892925-79332-246653865703939/source _original_basename=tmpxp0n2o68 follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:56:08 localhost python3[46075]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:56:09 localhost python3[46118]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143768.3209176-79391-126408765056182/source _original_basename=tmpas33e2ls follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:56:09 localhost python3[46148]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 26 02:56:09 localhost systemd[1]: Reloading.
Nov 26 02:56:09 localhost systemd-rc-local-generator[46175]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:56:09 localhost systemd-sysv-generator[46179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:56:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 26 02:56:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 3248 writes, 16K keys, 3248 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
Cumulative WAL: 3248 writes, 140 syncs, 23.20 writes per sync, written: 0.01 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 3248 writes, 16K keys, 3248 commit groups, 1.0 writes per commit group, ingest: 14.61 MB, 0.02 MB/s
Interval WAL: 3248 writes, 140 syncs, 23.20 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5593355362d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5593355362d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Nov 26 02:56:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:56:10 localhost python3[46201]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:56:10 localhost python3[46217]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 02:56:11 localhost python3[46234]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 02:56:11 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully. 
Nov 26 02:56:11 localhost python3[46251]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:56:11 localhost python3[46267]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:56:12 localhost python3[46315]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:56:12 localhost python3[46358]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143772.055027-79634-21928789810447/source _original_basename=tmp9lpq24xh follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:56:17 localhost sshd[46373]: main: sshd: ssh-rsa algorithm is disabled Nov 26 02:56:33 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
Nov 26 02:56:35 localhost python3[46392]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 26 02:56:35 localhost python3[46408]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None Nov 26 02:56:36 localhost python3[46424]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 26 02:56:36 localhost python3[46440]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:56:36 localhost python3[46456]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:56:37 localhost python3[46472]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Nov 26 02:56:37 localhost systemd[36090]: Created slice User Background Tasks Slice. Nov 26 02:56:37 localhost systemd[36090]: Starting Cleanup of User's Temporary Files and Directories... Nov 26 02:56:37 localhost systemd[36090]: Finished Cleanup of User's Temporary Files and Directories. Nov 26 02:56:37 localhost kernel: SELinux: Converting 2705 SID table entries... Nov 26 02:56:37 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 02:56:37 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 02:56:37 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 02:56:37 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 02:56:37 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 02:56:37 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 02:56:37 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 02:56:38 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=13 res=1 Nov 26 02:56:38 localhost python3[46495]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 02:56:40 localhost python3[46632]: 
ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': 
['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': [ Nov 26 02:56:40 localhost rsyslogd[760]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ] Nov 26 02:56:40 localhost python3[46648]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 26 02:56:41 localhost python3[46664]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 26 02:56:41 localhost python3[46680]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, 
'/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': 
'/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Nov 26 02:56:46 localhost python3[46804]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:56:46 localhost python3[46847]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143806.132532-81334-69721371217958/source _original_basename=tmpv1m1r709 follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:56:47 localhost python3[46877]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:56:49 localhost python3[47000]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 26 02:56:50 localhost python3[47121]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 26 02:56:52 localhost python3[47137]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:56:53 localhost python3[47154]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 26 02:56:57 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 26 02:56:57 localhost dbus-broker-launch[18453]: Noticed file-system modification, trigger reload.
Nov 26 02:56:57 localhost dbus-broker-launch[18453]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 26 02:56:57 localhost dbus-broker-launch[18453]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 26 02:56:57 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 26 02:56:58 localhost systemd[1]: Reexecuting.
Nov 26 02:56:58 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 26 02:56:58 localhost systemd[1]: Detected virtualization kvm.
Nov 26 02:56:58 localhost systemd[1]: Detected architecture x86-64.
Nov 26 02:56:58 localhost systemd-rc-local-generator[47210]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:56:58 localhost systemd-sysv-generator[47216]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:56:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:57:00 localhost sshd[47229]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:57:06 localhost kernel: SELinux: Converting 2705 SID table entries...
Nov 26 02:57:06 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 26 02:57:06 localhost kernel: SELinux: policy capability open_perms=1
Nov 26 02:57:06 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 26 02:57:06 localhost kernel: SELinux: policy capability always_check_network=0
Nov 26 02:57:06 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 26 02:57:06 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 26 02:57:06 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 26 02:57:06 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 26 02:57:06 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=14 res=1
Nov 26 02:57:06 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 26 02:57:07 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:57:07 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 02:57:07 localhost systemd[1]: Reloading.
Nov 26 02:57:07 localhost systemd-sysv-generator[47371]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:57:07 localhost systemd-rc-local-generator[47368]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:57:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:57:07 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 02:57:07 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 02:57:07 localhost systemd-journald[618]: Journal stopped
Nov 26 02:57:07 localhost systemd[1]: Stopping Journal Service...
Nov 26 02:57:07 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 26 02:57:07 localhost systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Nov 26 02:57:07 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 26 02:57:07 localhost systemd[1]: Stopped Journal Service.
Nov 26 02:57:07 localhost systemd[1]: systemd-journald.service: Consumed 1.902s CPU time.
Nov 26 02:57:07 localhost systemd[1]: Starting Journal Service...
Nov 26 02:57:07 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 26 02:57:07 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 26 02:57:07 localhost systemd[1]: systemd-udevd.service: Consumed 2.943s CPU time.
Nov 26 02:57:07 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 26 02:57:07 localhost systemd-journald[47778]: Journal started
Nov 26 02:57:07 localhost systemd-journald[47778]: Runtime Journal (/run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7) is 12.1M, max 314.7M, 302.6M free.
Nov 26 02:57:07 localhost systemd[1]: Started Journal Service.
Nov 26 02:57:07 localhost systemd-journald[47778]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 26 02:57:07 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 26 02:57:07 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 26 02:57:07 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 26 02:57:07 localhost systemd-udevd[47783]: Using default interface naming scheme 'rhel-9.0'.
Nov 26 02:57:07 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 26 02:57:08 localhost systemd[1]: Reloading.
Nov 26 02:57:08 localhost systemd-sysv-generator[48386]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 02:57:08 localhost systemd-rc-local-generator[48377]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 02:57:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 02:57:08 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 02:57:08 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 26 02:57:08 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 26 02:57:08 localhost systemd[1]: man-db-cache-update.service: Consumed 1.149s CPU time.
Nov 26 02:57:08 localhost systemd[1]: run-r355c4faa66444886ab7e6c7588a1af6c.service: Deactivated successfully.
Nov 26 02:57:08 localhost systemd[1]: run-r37e8760d89704804922a3f99c7f315ea.service: Deactivated successfully.
Nov 26 02:57:10 localhost python3[48650]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Nov 26 02:57:10 localhost python3[48669]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 02:57:11 localhost python3[48687]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:57:11 localhost python3[48687]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Nov 26 02:57:11 localhost python3[48687]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Nov 26 02:57:18 localhost podman[48701]: 2025-11-26 07:57:11.875530266 +0000 UTC m=+0.039111583 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 26 02:57:18 localhost python3[48687]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Nov 26 02:57:19 localhost python3[48802]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:57:19 localhost python3[48802]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Nov 26 02:57:19 localhost python3[48802]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Nov 26 02:57:26 localhost podman[48815]: 2025-11-26 07:57:19.318783491 +0000 UTC m=+0.042208272 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 26 02:57:26 localhost python3[48802]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Nov 26 02:57:26 localhost python3[48918]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:57:26 localhost python3[48918]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Nov 26 02:57:26 localhost python3[48918]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Nov 26 02:57:43 localhost podman[48931]: 2025-11-26 07:57:27.01656118 +0000 UTC m=+0.040245257 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 26 02:57:43 localhost python3[48918]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Nov 26 02:57:43 localhost python3[49278]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:57:43 localhost python3[49278]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Nov 26 02:57:43 localhost python3[49278]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Nov 26 02:57:44 localhost podman[49404]: 2025-11-26 07:57:44.734184411 +0000 UTC m=+0.086755864 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.tags=rhceph ceph, version=7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 26 02:57:44 localhost podman[49404]: 2025-11-26 07:57:44.860362059 +0000 UTC m=+0.212933512 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux )
Nov 26 02:57:45 localhost sshd[49524]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:57:54 localhost podman[49307]: 2025-11-26 07:57:44.021671979 +0000 UTC m=+0.041367095 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 26 02:57:54 localhost python3[49278]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Nov 26 02:57:55 localhost python3[49615]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:57:55 localhost python3[49615]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Nov 26 02:57:55 localhost python3[49615]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Nov 26 02:57:59 localhost podman[49629]: 2025-11-26 07:57:55.193037394 +0000 UTC m=+0.042878632 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 26 02:57:59 localhost python3[49615]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Nov 26 02:57:59 localhost python3[49717]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:57:59 localhost python3[49717]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Nov 26 02:57:59 localhost python3[49717]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Nov 26 02:58:08 localhost podman[49729]: 2025-11-26 07:57:59.599146424 +0000 UTC m=+0.042115889 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 26 02:58:08 localhost python3[49717]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Nov 26 02:58:08 localhost python3[50037]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:58:08 localhost python3[50037]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Nov 26 02:58:09 localhost python3[50037]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Nov 26 02:58:11 localhost podman[50050]: 2025-11-26 07:58:09.066796858 +0000 UTC m=+0.029883191 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 26 02:58:11 localhost python3[50037]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Nov 26 02:58:11 localhost python3[50128]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:58:11 localhost python3[50128]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Nov 26 02:58:11 localhost python3[50128]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Nov 26 02:58:13 localhost podman[50141]: 2025-11-26 07:58:11.906382614 +0000 UTC m=+0.032387979 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 26 02:58:13 localhost python3[50128]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json
Nov 26 02:58:14 localhost python3[50217]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:58:14 localhost python3[50217]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Nov 26 02:58:14 localhost python3[50217]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Nov 26 02:58:16 localhost podman[50230]: 2025-11-26 07:58:14.287245638 +0000 UTC m=+0.042947205 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 26 02:58:16 localhost python3[50217]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json
Nov 26 02:58:16 localhost python3[50307]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:58:16 localhost python3[50307]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Nov 26 02:58:17 localhost python3[50307]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Nov 26 02:58:21 localhost podman[50321]: 2025-11-26 07:58:17.064571875 +0000 UTC m=+0.044599515 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 26 02:58:21 localhost python3[50307]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json
Nov 26 02:58:21 localhost python3[50411]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 26 02:58:21 localhost python3[50411]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Nov 26 02:58:21 localhost python3[50411]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Nov 26 02:58:23 localhost podman[50423]: 2025-11-26 07:58:21.861359661 +0000 UTC m=+0.043204691 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 26 02:58:23 localhost python3[50411]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json
Nov 26 02:58:24 localhost python3[50500]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:58:26 localhost ansible-async_wrapper.py[50672]: Invoked with 18327176409 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143905.4611466-83916-74782369103903/AnsiballZ_command.py _
Nov 26 02:58:26 localhost ansible-async_wrapper.py[50677]: Starting module and watcher
Nov 26 02:58:26 localhost ansible-async_wrapper.py[50677]: Start watching 50678 (3600)
Nov 26 02:58:26 localhost ansible-async_wrapper.py[50678]: Start module (50678)
Nov 26 02:58:26 localhost ansible-async_wrapper.py[50672]: Return async_wrapper task started.
Nov 26 02:58:26 localhost python3[50699]: ansible-ansible.legacy.async_status Invoked with jid=18327176409.50672 mode=status _async_dir=/tmp/.ansible_async
Nov 26 02:58:28 localhost sshd[50739]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 02:58:29 localhost puppet-user[50683]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 26 02:58:29 localhost puppet-user[50683]: (file: /etc/puppet/hiera.yaml)
Nov 26 02:58:29 localhost puppet-user[50683]: Warning: Undefined variable '::deploy_config_name';
Nov 26 02:58:29 localhost puppet-user[50683]: (file & line not available)
Nov 26 02:58:29 localhost puppet-user[50683]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 26 02:58:29 localhost puppet-user[50683]: (file & line not available)
Nov 26 02:58:29 localhost puppet-user[50683]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 26 02:58:29 localhost puppet-user[50683]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 26 02:58:29 localhost puppet-user[50683]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.12 seconds
Nov 26 02:58:29 localhost puppet-user[50683]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Nov 26 02:58:29 localhost puppet-user[50683]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Nov 26 02:58:29 localhost puppet-user[50683]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Nov 26 02:58:29 localhost puppet-user[50683]: Notice: Applied catalog in 0.04 seconds
Nov 26 02:58:29 localhost puppet-user[50683]: Application:
Nov 26 02:58:29 localhost puppet-user[50683]: Initial environment: production
Nov 26 02:58:29 localhost puppet-user[50683]: Converged environment: production
Nov 26 02:58:29 localhost puppet-user[50683]: Run mode: user
Nov 26 02:58:29 localhost puppet-user[50683]: Changes:
Nov 26 02:58:29 localhost puppet-user[50683]: Total: 3
Nov 26 02:58:29 localhost puppet-user[50683]: Events:
Nov 26 02:58:29 localhost puppet-user[50683]: Success: 3
Nov 26 02:58:29 localhost puppet-user[50683]: Total: 3
Nov 26 02:58:29 localhost puppet-user[50683]: Resources:
Nov 26 02:58:29 localhost puppet-user[50683]: Changed: 3
Nov 26 02:58:29 localhost puppet-user[50683]: Out of sync: 3
Nov 26 02:58:29 localhost puppet-user[50683]: Total: 10
Nov 26 02:58:29 localhost puppet-user[50683]: Time:
Nov 26 02:58:29 localhost puppet-user[50683]: Schedule: 0.00
Nov 26 02:58:29 localhost puppet-user[50683]: File: 0.00
Nov 26 02:58:29 localhost puppet-user[50683]: Exec: 0.01
Nov 26 02:58:29 localhost puppet-user[50683]: Augeas: 0.02
Nov 26 02:58:29 localhost puppet-user[50683]: Transaction evaluation: 0.04
Nov 26 02:58:29 localhost puppet-user[50683]: Catalog application: 0.04
Nov 26 02:58:29 localhost puppet-user[50683]: Config retrieval: 0.15
Nov 26 02:58:29 localhost puppet-user[50683]: Last run: 1764143909
Nov 26 02:58:29 localhost puppet-user[50683]: Filebucket: 0.00
Nov 26 02:58:29 localhost puppet-user[50683]: Total: 0.04
Nov 26 02:58:29 localhost puppet-user[50683]: Version:
Nov 26 02:58:29 localhost puppet-user[50683]: Config: 1764143909
Nov 26 02:58:29 localhost puppet-user[50683]: Puppet: 7.10.0
Nov 26 02:58:30 localhost ansible-async_wrapper.py[50678]: Module complete (50678)
Nov 26 02:58:31 localhost ansible-async_wrapper.py[50677]: Done in kid B.
Nov 26 02:58:36 localhost python3[50829]: ansible-ansible.legacy.async_status Invoked with jid=18327176409.50672 mode=status _async_dir=/tmp/.ansible_async
Nov 26 02:58:37 localhost python3[50845]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 26 02:58:37 localhost python3[50861]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 02:58:38 localhost python3[50909]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 02:58:38 localhost python3[50952]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143917.9027739-84135-14027264778203/source _original_basename=tmp22m5uj5o follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 26 02:58:38 localhost python3[50982]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 02:58:40 localhost python3[51085]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 26 02:58:40 localhost python3[51104]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None
selevel=None attributes=None Nov 26 02:58:40 localhost python3[51120]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005536118 step=1 update_config_hash_only=False Nov 26 02:58:41 localhost python3[51136]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:58:41 localhost python3[51152]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 26 02:58:42 localhost python3[51168]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 26 02:58:43 localhost python3[51208]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False Nov 26 02:58:44 localhost podman[51403]: 2025-11-26 07:58:44.136481691 +0000 UTC m=+0.087402508 container create 6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, architecture=x86_64, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude 
tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 26 02:58:44 localhost podman[51413]: 2025-11-26 07:58:44.147389503 +0000 UTC m=+0.091867899 container create ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., container_name=container-puppet-crond, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044) Nov 26 02:58:44 localhost podman[51406]: 2025-11-26 07:58:44.160188023 +0000 UTC m=+0.106098084 container create 04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, config_id=tripleo_puppet_step1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, container_name=container-puppet-metrics_qdr, release=1761123044, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 02:58:44 localhost podman[51403]: 2025-11-26 07:58:44.081116417 +0000 UTC m=+0.032037254 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 02:58:44 localhost podman[51413]: 2025-11-26 07:58:44.083537183 +0000 UTC m=+0.028015589 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 26 02:58:44 localhost podman[51406]: 2025-11-26 07:58:44.085176814 +0000 UTC m=+0.031086895 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 26 02:58:44 localhost podman[51390]: 2025-11-26 07:58:44.194455667 +0000 UTC m=+0.162899264 container create 7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_puppet_step1, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, 
summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 02:58:44 localhost systemd[1]: Started libpod-conmon-04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652.scope. Nov 26 02:58:44 localhost systemd[1]: Started libpod-conmon-ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e.scope. Nov 26 02:58:44 localhost systemd[1]: Started libpod-conmon-7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd.scope. Nov 26 02:58:44 localhost podman[51390]: 2025-11-26 07:58:44.122915076 +0000 UTC m=+0.091358653 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 26 02:58:44 localhost podman[51425]: 2025-11-26 07:58:44.124380562 +0000 UTC m=+0.043305178 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 26 02:58:44 localhost systemd[1]: Started libcrun container. Nov 26 02:58:44 localhost systemd[1]: Started libcrun container. Nov 26 02:58:44 localhost systemd[1]: Started libcrun container. Nov 26 02:58:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/530f599619a39baef532cc44e4223e07e6aec95ffdcc3af67e9a79046d6bd825/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1dbb13f7b8362d56ad42775e94222a336ba84da2e29af0e3ec833723d2b6e61c/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6ea731fdd03e16191558f2f7302aba6c7ea628ed7b242aa43f6ff82950e24e1/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6ea731fdd03e16191558f2f7302aba6c7ea628ed7b242aa43f6ff82950e24e1/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:44 localhost podman[51413]: 2025-11-26 07:58:44.244792234 +0000 UTC 
m=+0.189270630 container init ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, container_name=container-puppet-crond, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 02:58:44 localhost podman[51425]: 2025-11-26 07:58:44.246744094 +0000 UTC m=+0.165668680 container create 80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, release=1761123044, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 26 02:58:44 localhost systemd[1]: Started libpod-conmon-80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f.scope. Nov 26 02:58:44 localhost systemd[1]: Started libcrun container. 
Nov 26 02:58:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d565698d62e4a7ac4d5579be96d621c994dd08165edad7b4fd7325073352493/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:45 localhost podman[51406]: 2025-11-26 07:58:45.090233925 +0000 UTC m=+1.036143986 container init 04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, build-date=2025-11-18T22:49:46Z, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_id=tripleo_puppet_step1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 
'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 02:58:45 localhost podman[51406]: 2025-11-26 07:58:45.146736135 +0000 UTC m=+1.092646226 container start 04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-qdrouterd, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true) Nov 26 02:58:45 localhost podman[51406]: 2025-11-26 07:58:45.147114077 +0000 UTC 
m=+1.093024168 container attach 04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_puppet_step1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 02:58:45 localhost podman[51390]: 2025-11-26 07:58:45.154783017 +0000 UTC m=+1.123226634 container init 7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, tcib_managed=true, config_id=tripleo_puppet_step1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, 
architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1) Nov 26 02:58:45 localhost podman[51413]: 2025-11-26 07:58:45.159444643 +0000 UTC m=+1.103923089 container start ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, 
vcs-type=git, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=container-puppet-crond, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, release=1761123044, vendor=Red Hat, Inc.) Nov 26 02:58:45 localhost podman[51413]: 2025-11-26 07:58:45.159866666 +0000 UTC m=+1.104345102 container attach ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-crond, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, url=https://www.redhat.com, name=rhosp17/openstack-cron, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 02:58:45 localhost podman[51390]: 2025-11-26 07:58:45.166838005 +0000 UTC m=+1.135281642 container start 7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, container_name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vendor=Red Hat, 
Inc.) Nov 26 02:58:45 localhost podman[51390]: 2025-11-26 07:58:45.168687692 +0000 UTC m=+1.137131319 container attach 7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, tcib_managed=true, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 02:58:45 localhost podman[51425]: 2025-11-26 07:58:45.17116498 +0000 UTC m=+1.090089596 container init 80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_id=tripleo_puppet_step1, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd) Nov 26 02:58:45 localhost podman[51425]: 2025-11-26 07:58:45.182058871 +0000 UTC m=+1.100983477 container start 80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, distribution-scope=public, 
managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-collectd-container, architecture=x86_64, container_name=container-puppet-collectd, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_puppet_step1) Nov 26 02:58:45 localhost podman[51425]: 2025-11-26 07:58:45.182409842 +0000 UTC m=+1.101334508 container attach 80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_id=tripleo_puppet_step1, version=17.1.12, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, container_name=container-puppet-collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 02:58:45 localhost systemd[1]: Started libpod-conmon-6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201.scope. Nov 26 02:58:45 localhost systemd[1]: Started libcrun container. 
Nov 26 02:58:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a4f9c83fc765d2523f15252985ad52f1a1e3e45616ae2a11946a3981b65bdf5/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:45 localhost podman[51403]: 2025-11-26 07:58:45.268424416 +0000 UTC m=+1.219345253 container init 6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably 
treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, container_name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 02:58:45 localhost podman[51403]: 2025-11-26 07:58:45.276470409 +0000 UTC m=+1.227391236 container start 6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, 
name=container-puppet-nova_libvirt, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, name=rhosp17/openstack-nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-nova_libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 02:58:45 localhost podman[51403]: 2025-11-26 07:58:45.276718836 +0000 UTC m=+1.227639663 container attach 6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=container-puppet-nova_libvirt, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, 
summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 02:58:46 localhost podman[51304]: 2025-11-26 07:58:43.996956371 +0000 UTC m=+0.047618323 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Nov 26 02:58:46 localhost podman[51629]: 2025-11-26 07:58:46.46062593 +0000 UTC m=+0.055311434 container create 373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-central, build-date=2025-11-19T00:11:59Z, release=1761123044, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 
'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ceilometer, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-central-container, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Nov 26 02:58:46 localhost 
systemd[1]: Started libpod-conmon-373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1.scope. Nov 26 02:58:46 localhost systemd[1]: tmp-crun.TI33Qv.mount: Deactivated successfully. Nov 26 02:58:46 localhost systemd[1]: Started libcrun container. Nov 26 02:58:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8429e6696a92b25e8a426aacc63cb14bbf8015d1d1cfea8ab0510d96125cec37/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:46 localhost podman[51629]: 2025-11-26 07:58:46.524401367 +0000 UTC m=+0.119086881 container init 373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:59Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-ceilometer-central, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 02:58:46 localhost podman[51629]: 2025-11-26 07:58:46.530472387 +0000 UTC m=+0.125157901 container start 373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-central, url=https://www.redhat.com, 
io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 
ceilometer-central, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_puppet_step1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container) Nov 26 02:58:46 localhost podman[51629]: 2025-11-26 07:58:46.530836239 +0000 UTC m=+0.125521783 container attach 373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, name=rhosp17/openstack-ceilometer-central, release=1761123044, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 
'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, com.redhat.component=openstack-ceilometer-central-container, build-date=2025-11-19T00:11:59Z, vendor=Red Hat, Inc., container_name=container-puppet-ceilometer, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 02:58:46 localhost podman[51629]: 2025-11-26 07:58:46.432374464 +0000 UTC m=+0.027059988 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Nov 26 02:58:46 localhost puppet-user[51504]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 26 02:58:46 localhost puppet-user[51504]: (file: /etc/puppet/hiera.yaml) Nov 26 02:58:46 localhost puppet-user[51504]: Warning: Undefined variable '::deploy_config_name'; Nov 26 02:58:46 localhost puppet-user[51504]: (file & line not available) Nov 26 02:58:46 localhost puppet-user[51531]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 26 02:58:46 localhost puppet-user[51531]: (file: /etc/puppet/hiera.yaml) Nov 26 02:58:46 localhost puppet-user[51531]: Warning: Undefined variable '::deploy_config_name'; Nov 26 02:58:46 localhost puppet-user[51531]: (file & line not available) Nov 26 02:58:46 localhost puppet-user[51504]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 02:58:46 localhost puppet-user[51504]: (file & line not available) Nov 26 02:58:46 localhost puppet-user[51531]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 02:58:46 localhost puppet-user[51531]: (file & line not available) Nov 26 02:58:46 localhost puppet-user[51504]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.07 seconds Nov 26 02:58:47 localhost puppet-user[51504]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0' Nov 26 02:58:47 localhost puppet-user[51504]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created Nov 26 02:58:47 localhost puppet-user[51506]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 26 02:58:47 localhost puppet-user[51506]: (file: /etc/puppet/hiera.yaml) Nov 26 02:58:47 localhost puppet-user[51506]: Warning: Undefined variable '::deploy_config_name'; Nov 26 02:58:47 localhost puppet-user[51506]: (file & line not available) Nov 26 02:58:47 localhost puppet-user[51531]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.11 seconds Nov 26 02:58:47 localhost ovs-vsctl[51933]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Nov 26 02:58:47 localhost puppet-user[51504]: Notice: Applied catalog in 0.03 seconds Nov 26 02:58:47 localhost puppet-user[51504]: Application: Nov 26 02:58:47 localhost puppet-user[51504]: Initial environment: production Nov 26 02:58:47 localhost puppet-user[51504]: Converged environment: production Nov 26 02:58:47 localhost puppet-user[51504]: Run mode: user Nov 26 02:58:47 localhost puppet-user[51504]: Changes: Nov 26 02:58:47 localhost puppet-user[51504]: Total: 2 Nov 26 02:58:47 localhost puppet-user[51504]: Events: Nov 26 02:58:47 localhost puppet-user[51504]: Success: 2 Nov 26 02:58:47 localhost puppet-user[51504]: Total: 2 Nov 26 02:58:47 localhost puppet-user[51504]: Resources: Nov 26 02:58:47 localhost puppet-user[51504]: Changed: 2 Nov 26 02:58:47 localhost puppet-user[51504]: Out of sync: 2 Nov 26 02:58:47 localhost puppet-user[51504]: Skipped: 7 Nov 26 02:58:47 localhost puppet-user[51504]: Total: 9 Nov 26 02:58:47 localhost puppet-user[51504]: Time: Nov 26 02:58:47 localhost puppet-user[51504]: File: 0.01 Nov 26 02:58:47 localhost puppet-user[51504]: Cron: 0.01 Nov 26 02:58:47 localhost puppet-user[51504]: Transaction evaluation: 0.03 Nov 26 02:58:47 localhost puppet-user[51504]: Catalog application: 0.03 Nov 26 02:58:47 localhost puppet-user[51504]: Config retrieval: 0.10 Nov 26 02:58:47 localhost puppet-user[51504]: Last run: 1764143927 Nov 26 02:58:47 localhost 
puppet-user[51504]: Total: 0.03 Nov 26 02:58:47 localhost puppet-user[51504]: Version: Nov 26 02:58:47 localhost puppet-user[51504]: Config: 1764143926 Nov 26 02:58:47 localhost puppet-user[51504]: Puppet: 7.10.0 Nov 26 02:58:47 localhost puppet-user[51533]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 26 02:58:47 localhost puppet-user[51533]: (file: /etc/puppet/hiera.yaml) Nov 26 02:58:47 localhost puppet-user[51533]: Warning: Undefined variable '::deploy_config_name'; Nov 26 02:58:47 localhost puppet-user[51533]: (file & line not available) Nov 26 02:58:47 localhost puppet-user[51506]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 02:58:47 localhost puppet-user[51506]: (file & line not available) Nov 26 02:58:47 localhost puppet-user[51531]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully Nov 26 02:58:47 localhost puppet-user[51531]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created Nov 26 02:58:47 localhost puppet-user[51533]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 02:58:47 localhost puppet-user[51533]: (file & line not available) Nov 26 02:58:47 localhost puppet-user[51506]: Notice: Accepting previously invalid value for target type 'Integer' Nov 26 02:58:47 localhost puppet-user[51531]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully Nov 26 02:58:47 localhost puppet-user[51506]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.13 seconds Nov 26 02:58:47 localhost puppet-user[51576]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 26 02:58:47 localhost puppet-user[51576]: (file: /etc/puppet/hiera.yaml) Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Undefined variable '::deploy_config_name'; Nov 26 02:58:47 localhost puppet-user[51576]: (file & line not available) Nov 26 02:58:47 localhost puppet-user[51506]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Nov 26 02:58:47 localhost puppet-user[51506]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Nov 26 02:58:47 localhost puppet-user[51506]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Nov 26 02:58:47 localhost puppet-user[51506]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Nov 26 02:58:47 localhost puppet-user[51576]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 02:58:47 localhost puppet-user[51576]: (file & line not available) Nov 26 02:58:47 localhost puppet-user[51506]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}b5bd859c5f57f2ae8ee4155d31e98c3f9ffa922b2035225d6b7e5639c1e17f2b' Nov 26 02:58:47 localhost puppet-user[51506]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Nov 26 02:58:47 localhost puppet-user[51506]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Nov 26 02:58:47 localhost puppet-user[51506]: Notice: Applied catalog in 0.05 seconds Nov 26 02:58:47 localhost puppet-user[51506]: Application: Nov 26 02:58:47 localhost puppet-user[51506]: Initial environment: production Nov 26 02:58:47 localhost puppet-user[51506]: Converged environment: production Nov 26 02:58:47 localhost puppet-user[51506]: Run mode: user 
Nov 26 02:58:47 localhost puppet-user[51506]: Changes: Nov 26 02:58:47 localhost puppet-user[51506]: Total: 7 Nov 26 02:58:47 localhost puppet-user[51506]: Events: Nov 26 02:58:47 localhost puppet-user[51506]: Success: 7 Nov 26 02:58:47 localhost puppet-user[51506]: Total: 7 Nov 26 02:58:47 localhost puppet-user[51506]: Resources: Nov 26 02:58:47 localhost puppet-user[51506]: Skipped: 13 Nov 26 02:58:47 localhost puppet-user[51506]: Changed: 5 Nov 26 02:58:47 localhost puppet-user[51506]: Out of sync: 5 Nov 26 02:58:47 localhost puppet-user[51506]: Total: 20 Nov 26 02:58:47 localhost puppet-user[51506]: Time: Nov 26 02:58:47 localhost puppet-user[51506]: File: 0.03 Nov 26 02:58:47 localhost puppet-user[51506]: Transaction evaluation: 0.04 Nov 26 02:58:47 localhost puppet-user[51506]: Catalog application: 0.05 Nov 26 02:58:47 localhost puppet-user[51506]: Config retrieval: 0.17 Nov 26 02:58:47 localhost puppet-user[51506]: Last run: 1764143927 Nov 26 02:58:47 localhost puppet-user[51506]: Total: 0.05 Nov 26 02:58:47 localhost puppet-user[51506]: Version: Nov 26 02:58:47 localhost puppet-user[51506]: Config: 1764143927 Nov 26 02:58:47 localhost puppet-user[51506]: Puppet: 7.10.0 Nov 26 02:58:47 localhost systemd[1]: libpod-ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e.scope: Deactivated successfully. Nov 26 02:58:47 localhost systemd[1]: libpod-ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e.scope: Consumed 2.051s CPU time. 
Nov 26 02:58:47 localhost podman[51413]: 2025-11-26 07:58:47.372267685 +0000 UTC m=+3.316746091 container died ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=container-puppet-crond, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 26 02:58:47 localhost puppet-user[51533]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.36 seconds Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \ Nov 26 02:58:47 localhost puppet-user[51576]: in a future release. Use nova::cinder::os_region_name instead Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \ Nov 26 02:58:47 localhost puppet-user[51576]: in a future release. Use nova::cinder::catalog_info instead Nov 26 02:58:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e-userdata-shm.mount: Deactivated successfully. Nov 26 02:58:47 localhost systemd[1]: var-lib-containers-storage-overlay-530f599619a39baef532cc44e4223e07e6aec95ffdcc3af67e9a79046d6bd825-merged.mount: Deactivated successfully. 
Nov 26 02:58:47 localhost podman[52057]: 2025-11-26 07:58:47.476217041 +0000 UTC m=+0.096354389 container cleanup ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=container-puppet-crond, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 02:58:47 localhost systemd[1]: libpod-04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652.scope: Deactivated successfully. Nov 26 02:58:47 localhost systemd[1]: libpod-04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652.scope: Consumed 2.143s CPU time. 
Nov 26 02:58:47 localhost podman[51406]: 2025-11-26 07:58:47.484226752 +0000 UTC m=+3.430136813 container died 04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, container_name=container-puppet-metrics_qdr, architecture=x86_64, config_id=tripleo_puppet_step1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 02:58:47 localhost systemd[1]: libpod-conmon-ce8feb837b22f7650b3de02512d73d9d64de9bd088a71d9ffde5621c22817a1e.scope: Deactivated successfully. Nov 26 02:58:47 localhost puppet-user[51531]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully Nov 26 02:58:47 localhost puppet-user[51531]: Notice: Applied catalog in 0.45 seconds Nov 26 02:58:47 localhost puppet-user[51531]: Application: Nov 26 02:58:47 localhost puppet-user[51531]: Initial environment: production Nov 26 02:58:47 localhost puppet-user[51531]: Converged environment: production Nov 26 02:58:47 localhost puppet-user[51531]: Run mode: user Nov 26 02:58:47 localhost puppet-user[51531]: Changes: Nov 26 02:58:47 localhost puppet-user[51531]: Total: 4 Nov 26 02:58:47 localhost puppet-user[51531]: Events: Nov 26 02:58:47 localhost puppet-user[51531]: Success: 4 Nov 26 02:58:47 localhost puppet-user[51531]: Total: 4 Nov 26 02:58:47 localhost puppet-user[51531]: Resources: Nov 26 02:58:47 localhost puppet-user[51531]: Changed: 4 Nov 26 02:58:47 localhost puppet-user[51531]: Out of sync: 4 Nov 26 02:58:47 localhost 
puppet-user[51531]: Skipped: 8 Nov 26 02:58:47 localhost puppet-user[51531]: Total: 13 Nov 26 02:58:47 localhost puppet-user[51531]: Time: Nov 26 02:58:47 localhost puppet-user[51531]: File: 0.00 Nov 26 02:58:47 localhost puppet-user[51531]: Exec: 0.04 Nov 26 02:58:47 localhost puppet-user[51531]: Config retrieval: 0.13 Nov 26 02:58:47 localhost puppet-user[51531]: Augeas: 0.40 Nov 26 02:58:47 localhost puppet-user[51531]: Transaction evaluation: 0.45 Nov 26 02:58:47 localhost puppet-user[51531]: Catalog application: 0.45 Nov 26 02:58:47 localhost puppet-user[51531]: Last run: 1764143927 Nov 26 02:58:47 localhost puppet-user[51531]: Total: 0.45 Nov 26 02:58:47 localhost puppet-user[51531]: Version: Nov 26 02:58:47 localhost puppet-user[51531]: Config: 1764143926 Nov 26 02:58:47 localhost puppet-user[51531]: Puppet: 7.10.0 Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640' Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. 
(file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41) Nov 26 02:58:47 localhost podman[52110]: 2025-11-26 07:58:47.541421303 +0000 UTC m=+0.045581029 container cleanup 04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Nov 26 02:58:47 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': 
['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 26 02:58:47 localhost systemd[1]: 
libpod-conmon-04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652.scope: Deactivated successfully. Nov 26 02:58:47 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: 
/Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: 
removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee' Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5) Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5) Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. 
(file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5) Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}0d4e701b7b2398bbf396579a0713d46d3c496c79edc52f2e260456f359c9a46c' Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba' Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set. Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. 
Use the same parameter in nova::glance Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: 
/Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62' Nov 26 02:58:47 localhost puppet-user[51533]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed Nov 26 02:58:47 localhost puppet-user[51533]: Notice: Applied catalog in 0.26 seconds Nov 26 02:58:47 localhost puppet-user[51533]: Application: Nov 26 02:58:47 localhost puppet-user[51533]: Initial environment: production Nov 26 02:58:47 localhost puppet-user[51533]: Converged environment: production Nov 26 02:58:47 localhost puppet-user[51533]: Run mode: user Nov 26 02:58:47 localhost puppet-user[51533]: Changes: Nov 26 02:58:47 localhost puppet-user[51533]: Total: 43 Nov 26 02:58:47 localhost puppet-user[51533]: Events: Nov 26 02:58:47 localhost puppet-user[51533]: Success: 43 Nov 26 02:58:47 localhost puppet-user[51533]: Total: 43 Nov 26 02:58:47 localhost puppet-user[51533]: Resources: Nov 26 02:58:47 localhost puppet-user[51533]: Skipped: 14 Nov 26 02:58:47 localhost puppet-user[51533]: Changed: 38 Nov 26 02:58:47 localhost puppet-user[51533]: Out of sync: 38 Nov 26 02:58:47 localhost puppet-user[51533]: Total: 82 Nov 26 02:58:47 localhost puppet-user[51533]: Time: Nov 26 02:58:47 localhost puppet-user[51533]: Concat fragment: 0.00 Nov 26 02:58:47 localhost puppet-user[51533]: Concat file: 0.00 Nov 26 02:58:47 localhost puppet-user[51533]: File: 0.10 Nov 26 02:58:47 localhost puppet-user[51533]: Transaction evaluation: 0.25 Nov 26 02:58:47 localhost puppet-user[51533]: Catalog application: 0.26 Nov 26 02:58:47 localhost puppet-user[51533]: Config retrieval: 0.42 Nov 26 02:58:47 localhost puppet-user[51533]: Last run: 1764143927 Nov 26 02:58:47 localhost puppet-user[51533]: Total: 0.26 Nov 26 02:58:47 localhost puppet-user[51533]: Version: Nov 26 02:58:47 localhost puppet-user[51533]: Config: 1764143927 Nov 26 02:58:47 localhost 
puppet-user[51533]: Puppet: 7.10.0 Nov 26 02:58:47 localhost systemd[1]: libpod-7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd.scope: Deactivated successfully. Nov 26 02:58:47 localhost systemd[1]: libpod-7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd.scope: Consumed 2.499s CPU time. Nov 26 02:58:47 localhost podman[51390]: 2025-11-26 07:58:47.808701725 +0000 UTC m=+3.777145352 container died 7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, container_name=container-puppet-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_puppet_step1, build-date=2025-11-18T23:44:13Z, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, managed_by=tripleo_ansible) Nov 26 02:58:47 localhost puppet-user[51576]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used. 
Nov 26 02:58:47 localhost podman[52225]: 2025-11-26 07:58:47.916406668 +0000 UTC m=+0.090834595 container cleanup 7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=container-puppet-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, 
tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com) Nov 26 02:58:47 localhost systemd[1]: libpod-conmon-7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd.scope: Deactivated successfully. Nov 26 02:58:47 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 
'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume 
/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 26 02:58:48 localhost podman[52283]: 2025-11-26 07:58:48.021106179 +0000 UTC m=+0.089240407 container create 9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-ovn_controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 02:58:48 localhost podman[52276]: 2025-11-26 07:58:47.943710234 +0000 UTC m=+0.029072641 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 26 02:58:48 localhost podman[52276]: 2025-11-26 07:58:48.047914218 +0000 UTC m=+0.133276605 container create 18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, com.redhat.component=openstack-rsyslog-container, container_name=container-puppet-rsyslog) Nov 26 02:58:48 localhost systemd[1]: Started libpod-conmon-9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871.scope. Nov 26 02:58:48 localhost systemd[1]: Started libcrun container. Nov 26 02:58:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/435018edd8bd40c695e2155529ca14d60cdcfce0e73e35f8aa64d52f759a88b7/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:48 localhost systemd[1]: Started libpod-conmon-18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64.scope. Nov 26 02:58:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/435018edd8bd40c695e2155529ca14d60cdcfce0e73e35f8aa64d52f759a88b7/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:48 localhost podman[52283]: 2025-11-26 07:58:48.072371484 +0000 UTC m=+0.140505702 container init 9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
container_name=container-puppet-ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 02:58:48 localhost systemd[1]: Started libcrun container. Nov 26 02:58:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97e8961208df3b0c873a479fd1758a4d1fe73c2607c9ea38c2b5c398da67b5e0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:48 localhost podman[52283]: 2025-11-26 07:58:48.079851719 +0000 UTC m=+0.147985967 container start 9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_puppet_step1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T23:34:05Z, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1) Nov 26 02:58:48 localhost podman[52283]: 2025-11-26 07:58:48.080060725 +0000 UTC m=+0.148194953 container attach 9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git) Nov 26 02:58:48 localhost podman[52276]: 2025-11-26 07:58:48.083972067 +0000 UTC m=+0.169334464 container init 18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, release=1761123044, version=17.1.12, url=https://www.redhat.com, container_name=container-puppet-rsyslog, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Nov 26 02:58:48 localhost podman[52283]: 2025-11-26 07:58:47.992077059 +0000 UTC m=+0.060211337 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 26 02:58:48 localhost podman[52276]: 2025-11-26 07:58:48.093762514 +0000 UTC m=+0.179124911 container start 18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=container-puppet-rsyslog, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, build-date=2025-11-18T22:49:49Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack 
Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Nov 26 02:58:48 localhost podman[52276]: 2025-11-26 07:58:48.094172957 +0000 UTC m=+0.179535374 container attach 18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=container-puppet-rsyslog, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog) Nov 26 02:58:48 localhost systemd[1]: libpod-80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f.scope: Deactivated successfully. Nov 26 02:58:48 localhost systemd[1]: libpod-80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f.scope: Consumed 2.730s CPU time. 
Nov 26 02:58:48 localhost podman[51425]: 2025-11-26 07:58:48.104527581 +0000 UTC m=+4.023452167 container died 80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd) Nov 26 02:58:48 localhost systemd[1]: var-lib-containers-storage-overlay-d6ea731fdd03e16191558f2f7302aba6c7ea628ed7b242aa43f6ff82950e24e1-merged.mount: Deactivated successfully. Nov 26 02:58:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e1a2071551517660ee6d28ca7a8a7beec5aa4c8719af174485690efc0cfc5cd-userdata-shm.mount: Deactivated successfully. Nov 26 02:58:48 localhost systemd[1]: var-lib-containers-storage-overlay-1dbb13f7b8362d56ad42775e94222a336ba84da2e29af0e3ec833723d2b6e61c-merged.mount: Deactivated successfully. Nov 26 02:58:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-04f2859716e73826af905aadb9557ae3d804a5f34812e88cf068c655c0bec652-userdata-shm.mount: Deactivated successfully. Nov 26 02:58:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f-userdata-shm.mount: Deactivated successfully. 
Nov 26 02:58:48 localhost systemd[1]: var-lib-containers-storage-overlay-9d565698d62e4a7ac4d5579be96d621c994dd08165edad7b4fd7325073352493-merged.mount: Deactivated successfully. Nov 26 02:58:48 localhost podman[52382]: 2025-11-26 07:58:48.243860105 +0000 UTC m=+0.128442994 container cleanup 80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=container-puppet-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 02:58:48 localhost systemd[1]: libpod-conmon-80669e6755acd5530d13ce765bfc4efa03cf762e2a200b506d624187fb920b3f.scope: Deactivated successfully. 
Nov 26 02:58:48 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 1.30 seconds
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 26 02:58:48 localhost puppet-user[51663]: (file: /etc/puppet/hiera.yaml)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Undefined variable '::deploy_config_name';
Nov 26 02:58:48 localhost puppet-user[51663]: (file & line not available)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 26 02:58:48 localhost puppet-user[51663]: (file & line not available)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}64612648ff59780f027ed5e542170b4bfd2b6e44811f308331dcbd4a50088db0'
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Nov 26 02:58:48 localhost puppet-user[51576]: Warning: Empty environment setting 'TLS_PASSWORD'
Nov 26 02:58:48 localhost puppet-user[51576]: (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Nov 26 02:58:48 localhost puppet-user[51663]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}ac8b5a10430f8e0e97cf3a405fd15d5cca1b6363afb22e84117627bdfd4e8f1f'
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51663]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.36 seconds
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Nov 26 02:58:48 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51663]: Notice: Applied catalog in 0.44 seconds
Nov 26 02:58:49 localhost puppet-user[51663]: Application:
Nov 26 02:58:49 localhost puppet-user[51663]: Initial environment: production
Nov 26 02:58:49 localhost puppet-user[51663]: Converged environment: production
Nov 26 02:58:49 localhost puppet-user[51663]: Run mode: user
Nov 26 02:58:49 localhost puppet-user[51663]: Changes:
Nov 26 02:58:49 localhost puppet-user[51663]: Total: 31
Nov 26 02:58:49 localhost puppet-user[51663]: Events:
Nov 26 02:58:49 localhost puppet-user[51663]: Success: 31
Nov 26 02:58:49 localhost puppet-user[51663]: Total: 31
Nov 26 02:58:49 localhost puppet-user[51663]: Resources:
Nov 26 02:58:49 localhost puppet-user[51663]: Skipped: 22
Nov 26 02:58:49 localhost puppet-user[51663]: Changed: 31
Nov 26 02:58:49 localhost puppet-user[51663]: Out of sync: 31
Nov 26 02:58:49 localhost puppet-user[51663]: Total: 151
Nov 26 02:58:49 localhost puppet-user[51663]: Time:
Nov 26 02:58:49 localhost puppet-user[51663]: Package: 0.03
Nov 26 02:58:49 localhost puppet-user[51663]: Ceilometer config: 0.34
Nov 26 02:58:49 localhost puppet-user[51663]: Config retrieval: 0.43
Nov 26 02:58:49 localhost puppet-user[51663]: Transaction evaluation: 0.44
Nov 26 02:58:49 localhost puppet-user[51663]: Catalog application: 0.44
Nov 26 02:58:49 localhost puppet-user[51663]: Last run: 1764143929
Nov 26 02:58:49 localhost puppet-user[51663]: Resources: 0.00
Nov 26 02:58:49 localhost puppet-user[51663]: Total: 0.44
Nov 26 02:58:49 localhost puppet-user[51663]: Version:
Nov 26 02:58:49 localhost puppet-user[51663]: Config: 1764143928
Nov 26 02:58:49 localhost puppet-user[51663]: Puppet: 7.10.0
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Nov 26 02:58:49 localhost puppet-user[52402]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 26 02:58:49 localhost puppet-user[52402]: (file: /etc/puppet/hiera.yaml)
Nov 26 02:58:49 localhost puppet-user[52402]: Warning: Undefined variable '::deploy_config_name';
Nov 26 02:58:49 localhost puppet-user[52402]: (file & line not available)
Nov 26 02:58:49 localhost systemd[1]: libpod-373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1.scope: Deactivated successfully.
Nov 26 02:58:49 localhost systemd[1]: libpod-373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1.scope: Consumed 3.059s CPU time.
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Nov 26 02:58:49 localhost podman[51629]: 2025-11-26 07:58:49.932644693 +0000 UTC m=+3.527330267 container died 373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:59Z, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, architecture=x86_64, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central)
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Nov 26 02:58:49 localhost puppet-user[52402]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 26 02:58:49 localhost puppet-user[52402]: (file & line not available)
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Nov 26 02:58:49 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Nov 26 02:58:49 localhost puppet-user[52396]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 26 02:58:49 localhost puppet-user[52396]: (file: /etc/puppet/hiera.yaml)
Nov 26 02:58:49 localhost puppet-user[52396]: Warning: Undefined variable '::deploy_config_name';
Nov 26 02:58:49 localhost puppet-user[52396]: (file & line not available)
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Nov 26 02:58:50 localhost systemd[1]: tmp-crun.XRAZxk.mount: Deactivated successfully.
Nov 26 02:58:50 localhost systemd[1]: var-lib-containers-storage-overlay-8429e6696a92b25e8a426aacc63cb14bbf8015d1d1cfea8ab0510d96125cec37-merged.mount: Deactivated successfully.
Nov 26 02:58:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1-userdata-shm.mount: Deactivated successfully.
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Nov 26 02:58:50 localhost puppet-user[52396]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 26 02:58:50 localhost puppet-user[52396]: (file & line not available)
Nov 26 02:58:50 localhost podman[52691]: 2025-11-26 07:58:50.042507084 +0000 UTC m=+0.097380002 container cleanup 373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-central-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, build-date=2025-11-19T00:11:59Z, name=rhosp17/openstack-ceilometer-central, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, batch=17.1_20251118.1, container_name=container-puppet-ceilometer, version=17.1.12)
Nov 26 02:58:50 localhost systemd[1]: libpod-conmon-373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1.scope: Deactivated successfully.
Nov 26 02:58:50 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Nov 26 02:58:50 localhost puppet-user[52402]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.24 seconds
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Nov 26 02:58:50 localhost puppet-user[52396]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.22 seconds
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Nov 26 02:58:50 localhost puppet-user[52402]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Nov 26 02:58:50 localhost ovs-vsctl[52750]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Nov 26 02:58:50 localhost puppet-user[52402]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Nov 26 02:58:50 localhost puppet-user[52402]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}9f4690969acd38737c877a7621ae4daff22181d0cafada041bcc7a14d361f103'
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Nov 26 02:58:50 localhost puppet-user[52402]: Notice: Applied catalog in 0.11 seconds
Nov 26 02:58:50 localhost puppet-user[52402]: Application:
Nov 26 02:58:50 localhost puppet-user[52402]: Initial environment: production
Nov 26 02:58:50 localhost puppet-user[52402]: Converged environment: production
Nov 26 02:58:50 localhost puppet-user[52402]: Run mode: user
Nov 26 02:58:50 localhost puppet-user[52402]: Changes:
Nov 26 02:58:50 localhost
puppet-user[52402]: Total: 3 Nov 26 02:58:50 localhost puppet-user[52402]: Events: Nov 26 02:58:50 localhost puppet-user[52402]: Success: 3 Nov 26 02:58:50 localhost puppet-user[52402]: Total: 3 Nov 26 02:58:50 localhost puppet-user[52402]: Resources: Nov 26 02:58:50 localhost puppet-user[52402]: Skipped: 11 Nov 26 02:58:50 localhost puppet-user[52402]: Changed: 3 Nov 26 02:58:50 localhost puppet-user[52402]: Out of sync: 3 Nov 26 02:58:50 localhost puppet-user[52402]: Total: 25 Nov 26 02:58:50 localhost puppet-user[52402]: Time: Nov 26 02:58:50 localhost puppet-user[52402]: Concat file: 0.00 Nov 26 02:58:50 localhost puppet-user[52402]: Concat fragment: 0.00 Nov 26 02:58:50 localhost puppet-user[52402]: File: 0.01 Nov 26 02:58:50 localhost puppet-user[52402]: Transaction evaluation: 0.10 Nov 26 02:58:50 localhost puppet-user[52402]: Catalog application: 0.11 Nov 26 02:58:50 localhost puppet-user[52402]: Config retrieval: 0.29 Nov 26 02:58:50 localhost puppet-user[52402]: Last run: 1764143930 Nov 26 02:58:50 localhost puppet-user[52402]: Total: 0.11 Nov 26 02:58:50 localhost puppet-user[52402]: Version: Nov 26 02:58:50 localhost puppet-user[52402]: Config: 1764143929 Nov 26 02:58:50 localhost puppet-user[52402]: Puppet: 7.10.0 Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52752]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-type=geneve Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52754]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.107 Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52757]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:hostname=np0005536118.localdomain Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005536118.novalocal' to 'np0005536118.localdomain' Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52765]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52767]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-remote-probe-interval=60000 Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}923ff499ad2ca4b6a1119828e24fe060a65d41120910ab03c0ad243e75bcc1fe' Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52772]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60 Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52774]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52776]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-ofctrl-wait-before-clear=8000 Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52782]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0 Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52791]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:8d:5f:be Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52796]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-bridge-mappings=datacentre:br-ex Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created Nov 26 02:58:50 localhost ovs-vsctl[52803]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created Nov 26 02:58:50 localhost systemd[1]: libpod-18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64.scope: Deactivated successfully. Nov 26 02:58:50 localhost systemd[1]: libpod-18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64.scope: Consumed 2.370s CPU time. 
Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Nov 26 02:58:50 localhost podman[52276]: 2025-11-26 07:58:50.632537956 +0000 UTC m=+2.717900343 container died 18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=container-puppet-rsyslog, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1)
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Nov 26 02:58:50 localhost ovs-vsctl[52817]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Nov 26 02:58:50 localhost puppet-user[52396]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Nov 26 02:58:50 localhost puppet-user[52396]: Notice: Applied catalog in 0.48 seconds
Nov 26 02:58:50 localhost puppet-user[52396]: Application:
Nov 26 02:58:50 localhost puppet-user[52396]: Initial environment: production
Nov 26 02:58:50 localhost puppet-user[52396]: Converged environment: production
Nov 26 02:58:50 localhost puppet-user[52396]: Run mode: user
Nov 26 02:58:50 localhost puppet-user[52396]: Changes:
Nov 26 02:58:50 localhost puppet-user[52396]: Total: 14
Nov 26 02:58:50 localhost puppet-user[52396]: Events:
Nov 26 02:58:50 localhost puppet-user[52396]: Success: 14
Nov 26 02:58:50 localhost puppet-user[52396]: Total: 14
Nov 26 02:58:50 localhost puppet-user[52396]: Resources:
Nov 26 02:58:50 localhost puppet-user[52396]: Skipped: 12
Nov 26 02:58:50 localhost puppet-user[52396]: Changed: 14
Nov 26 02:58:50 localhost puppet-user[52396]: Out of sync: 14
Nov 26 02:58:50 localhost puppet-user[52396]: Total: 29
Nov 26 02:58:50 localhost puppet-user[52396]: Time:
Nov 26 02:58:50 localhost puppet-user[52396]: Exec: 0.01
Nov 26 02:58:50 localhost puppet-user[52396]: Config retrieval: 0.25
Nov 26 02:58:50 localhost puppet-user[52396]: Vs config: 0.42
Nov 26 02:58:50 localhost puppet-user[52396]: Transaction evaluation: 0.47
Nov 26 02:58:50 localhost puppet-user[52396]: Catalog application: 0.48
Nov 26 02:58:50 localhost puppet-user[52396]: Last run: 1764143930
Nov 26 02:58:50 localhost puppet-user[52396]: Total: 0.48
Nov 26 02:58:50 localhost puppet-user[52396]: Version:
Nov 26 02:58:50 localhost puppet-user[52396]: Config: 1764143929
Nov 26 02:58:50 localhost puppet-user[52396]: Puppet: 7.10.0
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Nov 26 02:58:50 localhost podman[52811]: 2025-11-26 07:58:50.738566547 +0000 UTC m=+0.094291955 container cleanup 18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=container-puppet-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 26 02:58:50 localhost systemd[1]: libpod-conmon-18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64.scope: Deactivated successfully.
Nov 26 02:58:50 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Nov 26 02:58:50 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Nov 26 02:58:50 localhost systemd[1]: var-lib-containers-storage-overlay-97e8961208df3b0c873a479fd1758a4d1fe73c2607c9ea38c2b5c398da67b5e0-merged.mount: Deactivated successfully.
Nov 26 02:58:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64-userdata-shm.mount: Deactivated successfully.
Nov 26 02:58:51 localhost systemd[1]: libpod-9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871.scope: Deactivated successfully.
Nov 26 02:58:51 localhost systemd[1]: libpod-9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871.scope: Consumed 2.743s CPU time.
Nov 26 02:58:51 localhost podman[52283]: 2025-11-26 07:58:51.057475165 +0000 UTC m=+3.125609403 container died 9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-ovn_controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, release=1761123044) Nov 26 02:58:51 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully Nov 26 02:58:51 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871-userdata-shm.mount: Deactivated successfully. Nov 26 02:58:51 localhost systemd[1]: var-lib-containers-storage-overlay-435018edd8bd40c695e2155529ca14d60cdcfce0e73e35f8aa64d52f759a88b7-merged.mount: Deactivated successfully. 
Nov 26 02:58:51 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully Nov 26 02:58:51 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created Nov 26 02:58:51 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created Nov 26 02:58:51 localhost podman[52899]: 2025-11-26 07:58:51.929690246 +0000 UTC m=+0.862122365 container cleanup 9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, container_name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 26 02:58:51 localhost systemd[1]: libpod-conmon-9fde5d7d7ddf0c6d78b84701c5b584dcce6ace57a27a37ab23ada58108bb5871.scope: Deactivated successfully. 
Nov 26 02:58:51 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 26 02:58:51 localhost podman[52428]: 2025-11-26 07:58:48.220846405 +0000 UTC m=+0.029776284 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: 
/Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created Nov 26 02:58:52 localhost podman[52957]: 2025-11-26 07:58:52.186792699 +0000 UTC m=+0.077976603 container create d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, release=1761123044, distribution-scope=public, container_name=container-puppet-neutron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:23:27Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-server, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server) Nov 26 02:58:52 localhost systemd[1]: Started libpod-conmon-d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f.scope. Nov 26 02:58:52 localhost systemd[1]: Started libcrun container. 
Nov 26 02:58:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61b78344a7655d1356d5611c58b53e5d1f1449149a2ea477d54243044567e90e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 26 02:58:52 localhost podman[52957]: 2025-11-26 07:58:52.146028032 +0000 UTC m=+0.037212006 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 26 02:58:52 localhost podman[52957]: 2025-11-26 07:58:52.252473286 +0000 UTC m=+0.143657200 container init d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, release=1761123044, name=rhosp17/openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, summary=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, container_name=container-puppet-neutron, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Nov 26 02:58:52 localhost podman[52957]: 2025-11-26 07:58:52.261189019 +0000 UTC m=+0.152372923 container start d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, build-date=2025-11-19T00:23:27Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, tcib_managed=true, managed_by=tripleo_ansible) Nov 26 02:58:52 localhost podman[52957]: 2025-11-26 07:58:52.261389085 +0000 UTC m=+0.152572989 container attach d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, release=1761123044, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, build-date=2025-11-19T00:23:27Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-server-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created Nov 26 
02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: 
/Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created Nov 26 02:58:52 localhost puppet-user[51576]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3a12438802493a75725c4f7704f2af6db1ef72af396369e5de28f6f4d6a7ed98' Nov 26 02:58:52 localhost puppet-user[51576]: Notice: Applied catalog in 4.23 seconds Nov 26 02:58:52 localhost puppet-user[51576]: Application: Nov 26 02:58:52 localhost puppet-user[51576]: Initial environment: production Nov 26 02:58:52 localhost puppet-user[51576]: Converged environment: production Nov 26 02:58:52 localhost puppet-user[51576]: Run mode: user Nov 26 02:58:52 localhost puppet-user[51576]: Changes: Nov 26 02:58:52 localhost puppet-user[51576]: Total: 183 Nov 26 02:58:52 localhost puppet-user[51576]: Events: Nov 26 02:58:52 localhost puppet-user[51576]: Success: 183 Nov 26 02:58:52 localhost puppet-user[51576]: Total: 183 Nov 26 02:58:52 localhost puppet-user[51576]: Resources: Nov 26 02:58:52 localhost puppet-user[51576]: Changed: 183 Nov 26 02:58:52 localhost puppet-user[51576]: Out of sync: 183 Nov 26 02:58:52 localhost puppet-user[51576]: Skipped: 57 Nov 26 02:58:52 localhost puppet-user[51576]: Total: 487 Nov 26 02:58:52 localhost puppet-user[51576]: Time: Nov 26 02:58:52 localhost puppet-user[51576]: Concat file: 0.00 Nov 26 02:58:52 localhost puppet-user[51576]: Concat fragment: 0.00 Nov 26 02:58:52 localhost puppet-user[51576]: Anchor: 0.00 Nov 26 02:58:52 
localhost puppet-user[51576]: File line: 0.00 Nov 26 02:58:52 localhost puppet-user[51576]: Virtlogd config: 0.00 Nov 26 02:58:52 localhost puppet-user[51576]: Virtqemud config: 0.01 Nov 26 02:58:52 localhost puppet-user[51576]: Virtnodedevd config: 0.01 Nov 26 02:58:52 localhost puppet-user[51576]: Exec: 0.01 Nov 26 02:58:52 localhost puppet-user[51576]: Package: 0.02 Nov 26 02:58:52 localhost puppet-user[51576]: Virtsecretd config: 0.02 Nov 26 02:58:52 localhost puppet-user[51576]: File: 0.02 Nov 26 02:58:52 localhost puppet-user[51576]: Virtproxyd config: 0.03 Nov 26 02:58:52 localhost puppet-user[51576]: Virtstoraged config: 0.03 Nov 26 02:58:52 localhost puppet-user[51576]: Augeas: 0.96 Nov 26 02:58:52 localhost puppet-user[51576]: Config retrieval: 1.54 Nov 26 02:58:52 localhost puppet-user[51576]: Last run: 1764143932 Nov 26 02:58:52 localhost puppet-user[51576]: Nova config: 2.90 Nov 26 02:58:52 localhost puppet-user[51576]: Transaction evaluation: 4.21 Nov 26 02:58:52 localhost puppet-user[51576]: Catalog application: 4.23 Nov 26 02:58:52 localhost puppet-user[51576]: Resources: 0.00 Nov 26 02:58:52 localhost puppet-user[51576]: Total: 4.23 Nov 26 02:58:52 localhost puppet-user[51576]: Version: Nov 26 02:58:52 localhost puppet-user[51576]: Config: 1764143927 Nov 26 02:58:52 localhost puppet-user[51576]: Puppet: 7.10.0 Nov 26 02:58:53 localhost systemd[1]: libpod-6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201.scope: Deactivated successfully. Nov 26 02:58:53 localhost systemd[1]: libpod-6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201.scope: Consumed 8.415s CPU time. 
Nov 26 02:58:53 localhost podman[51403]: 2025-11-26 07:58:53.808471914 +0000 UTC m=+9.759392811 container died 6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:35:22Z, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 02:58:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201-userdata-shm.mount: Deactivated successfully. Nov 26 02:58:53 localhost systemd[1]: var-lib-containers-storage-overlay-9a4f9c83fc765d2523f15252985ad52f1a1e3e45616ae2a11946a3981b65bdf5-merged.mount: Deactivated successfully. 
Nov 26 02:58:53 localhost puppet-user[52987]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass Nov 26 02:58:54 localhost podman[53030]: 2025-11-26 07:58:54.104805606 +0000 UTC m=+0.282884402 container cleanup 6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until 
then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64) Nov 26 02:58:54 localhost systemd[1]: libpod-conmon-6b81e4611bb5e1e02ce4f60ec21c6621ca2d62bbb5a92ce57c127a9564c5d201.scope: Deactivated successfully. 
Nov 26 02:58:54 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt 
profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw 
--volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 02:58:54 localhost puppet-user[52987]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 26 02:58:54 localhost puppet-user[52987]: (file: /etc/puppet/hiera.yaml) Nov 26 02:58:54 localhost puppet-user[52987]: Warning: Undefined variable '::deploy_config_name'; Nov 26 02:58:54 localhost puppet-user[52987]: (file & line not available) Nov 26 02:58:54 localhost puppet-user[52987]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 02:58:54 localhost puppet-user[52987]: (file & line not available) Nov 26 02:58:54 localhost puppet-user[52987]: Warning: Unknown variable: 'dhcp_agents_per_net'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37) Nov 26 02:58:54 localhost puppet-user[52987]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.62 seconds Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created Nov 26 
02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created Nov 26 02:58:54 
localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Nov 26 02:58:54 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Nov 26 02:58:55 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created Nov 26 02:58:55 localhost puppet-user[52987]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created Nov 26 02:58:55 localhost puppet-user[52987]: Notice: Applied catalog in 0.43 seconds Nov 26 02:58:55 localhost puppet-user[52987]: Application: Nov 26 02:58:55 localhost puppet-user[52987]: Initial environment: production Nov 26 02:58:55 localhost puppet-user[52987]: Converged environment: production Nov 26 02:58:55 localhost puppet-user[52987]: Run mode: user Nov 26 02:58:55 localhost puppet-user[52987]: Changes: Nov 26 02:58:55 localhost puppet-user[52987]: Total: 33 Nov 26 02:58:55 localhost puppet-user[52987]: Events: Nov 26 02:58:55 localhost puppet-user[52987]: Success: 33 
Nov 26 02:58:55 localhost puppet-user[52987]: Total: 33 Nov 26 02:58:55 localhost puppet-user[52987]: Resources: Nov 26 02:58:55 localhost puppet-user[52987]: Skipped: 21 Nov 26 02:58:55 localhost puppet-user[52987]: Changed: 33 Nov 26 02:58:55 localhost puppet-user[52987]: Out of sync: 33 Nov 26 02:58:55 localhost puppet-user[52987]: Total: 155 Nov 26 02:58:55 localhost puppet-user[52987]: Time: Nov 26 02:58:55 localhost puppet-user[52987]: Resources: 0.00 Nov 26 02:58:55 localhost puppet-user[52987]: Ovn metadata agent config: 0.01 Nov 26 02:58:55 localhost puppet-user[52987]: Neutron config: 0.36 Nov 26 02:58:55 localhost puppet-user[52987]: Transaction evaluation: 0.42 Nov 26 02:58:55 localhost puppet-user[52987]: Catalog application: 0.43 Nov 26 02:58:55 localhost puppet-user[52987]: Config retrieval: 0.69 Nov 26 02:58:55 localhost puppet-user[52987]: Last run: 1764143935 Nov 26 02:58:55 localhost puppet-user[52987]: Total: 0.43 Nov 26 02:58:55 localhost puppet-user[52987]: Version: Nov 26 02:58:55 localhost puppet-user[52987]: Config: 1764143934 Nov 26 02:58:55 localhost puppet-user[52987]: Puppet: 7.10.0 Nov 26 02:58:55 localhost systemd[1]: libpod-d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f.scope: Deactivated successfully. Nov 26 02:58:55 localhost systemd[1]: libpod-d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f.scope: Consumed 3.482s CPU time. 
Nov 26 02:58:55 localhost podman[53170]: 2025-11-26 07:58:55.797480845 +0000 UTC m=+0.049389767 container died d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, name=rhosp17/openstack-neutron-server, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:23:27Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 
'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, tcib_managed=true) Nov 26 02:58:55 localhost systemd[1]: tmp-crun.KOxld7.mount: Deactivated successfully. Nov 26 02:58:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f-userdata-shm.mount: Deactivated successfully. Nov 26 02:58:55 localhost systemd[1]: var-lib-containers-storage-overlay-61b78344a7655d1356d5611c58b53e5d1f1449149a2ea477d54243044567e90e-merged.mount: Deactivated successfully. 
Nov 26 02:58:55 localhost podman[53170]: 2025-11-26 07:58:55.869483851 +0000 UTC m=+0.121392703 container cleanup d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, tcib_managed=true, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, vcs-type=git, release=1761123044, com.redhat.component=openstack-neutron-server-container, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:23:27Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.12, name=rhosp17/openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=container-puppet-neutron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 02:58:55 localhost systemd[1]: libpod-conmon-d7b84403fc3916033b0046105925898295f709f65d3442ce3770ecd53e58551f.scope: Deactivated successfully. 
Nov 26 02:58:55 localhost python3[51208]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005536118 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005536118', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 26 02:58:56 localhost python3[53221]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:58:57 localhost python3[53253]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 02:58:58 localhost python3[53303]: ansible-ansible.legacy.stat 
Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:58:58 localhost python3[53346]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143937.8313973-84713-174451628462680/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:58:59 localhost python3[53408]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:58:59 localhost python3[53451]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143938.6771042-84713-55862032534021/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:58:59 localhost python3[53513]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:59:00 localhost python3[53556]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143939.6479783-84775-8156703363262/source 
dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:59:00 localhost python3[53618]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:59:01 localhost python3[53661]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143940.532325-84807-279411759038020/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:59:01 localhost python3[53691]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 02:59:01 localhost systemd[1]: Reloading. Nov 26 02:59:01 localhost systemd-rc-local-generator[53713]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:59:01 localhost systemd-sysv-generator[53718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 02:59:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:59:02 localhost systemd[1]: Reloading. Nov 26 02:59:02 localhost systemd-rc-local-generator[53754]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:59:02 localhost systemd-sysv-generator[53759]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 02:59:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:59:02 localhost systemd[1]: Starting TripleO Container Shutdown... Nov 26 02:59:02 localhost systemd[1]: Finished TripleO Container Shutdown. Nov 26 02:59:02 localhost python3[53815]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:59:03 localhost python3[53858]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143942.6150546-84863-45323173373169/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:59:03 localhost python3[53920]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False 
get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 02:59:04 localhost python3[53963]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143943.511654-84968-42104955935432/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:59:04 localhost python3[53993]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 02:59:04 localhost systemd[1]: Reloading. Nov 26 02:59:04 localhost systemd-sysv-generator[54023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 02:59:04 localhost systemd-rc-local-generator[54018]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:59:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:59:04 localhost systemd[1]: Reloading. Nov 26 02:59:04 localhost systemd-rc-local-generator[54058]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:59:04 localhost systemd-sysv-generator[54061]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 02:59:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:59:05 localhost systemd[1]: Starting Create netns directory... Nov 26 02:59:05 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 02:59:05 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 02:59:05 localhost systemd[1]: Finished Create netns directory. Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: cb52c88276a571bf332b7657a13eab07 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: 4767aaabc3de112d8791c290aa2b669d Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 62782eb5f982aaac812488dee300321e Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost 
python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 89e2bf3e240198013fa934e7fe0b50df Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: f94fd18b42545cee37022470afd201a1 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: f94fd18b42545cee37022470afd201a1 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 8346a4a86ac2c2b1d52b2e36f598d419 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:05 localhost python3[54086]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: c7803ed1795969cb7cf47e6d4d57c4b9 Nov 26 02:59:07 localhost python3[54144]: 
ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 26 02:59:07 localhost podman[54185]: 2025-11-26 07:59:07.435503041 +0000 UTC m=+0.094091289 container create 11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, url=https://www.redhat.com, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, 
com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 02:59:07 localhost systemd[1]: Started libpod-conmon-11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98.scope. Nov 26 02:59:07 localhost podman[54185]: 2025-11-26 07:59:07.392434251 +0000 UTC m=+0.051022529 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 26 02:59:07 localhost systemd[1]: Started libcrun container. Nov 26 02:59:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29f95cfe95f535f9d59641ec99dde393a4714016cd95f7dbb20217cba1000992/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 26 02:59:07 localhost podman[54185]: 2025-11-26 07:59:07.50858415 +0000 UTC m=+0.167172368 container init 11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs) Nov 26 02:59:07 localhost podman[54185]: 2025-11-26 07:59:07.518213731 +0000 UTC m=+0.176801949 container start 11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr_init_logs, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 
qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 26 02:59:07 localhost podman[54185]: 2025-11-26 07:59:07.518443598 +0000 UTC m=+0.177031876 container attach 11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr_init_logs, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 02:59:07 localhost systemd[1]: libpod-11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98.scope: Deactivated successfully. Nov 26 02:59:07 localhost podman[54185]: 2025-11-26 07:59:07.527312047 +0000 UTC m=+0.185900305 container died 11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr_init_logs) Nov 26 02:59:07 localhost podman[54204]: 2025-11-26 07:59:07.612119743 +0000 UTC m=+0.075893428 container cleanup 11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, release=1761123044, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, architecture=x86_64) Nov 26 
02:59:07 localhost systemd[1]: libpod-conmon-11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98.scope: Deactivated successfully. Nov 26 02:59:07 localhost python3[54144]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Nov 26 02:59:08 localhost systemd[1]: var-lib-containers-storage-overlay-29f95cfe95f535f9d59641ec99dde393a4714016cd95f7dbb20217cba1000992-merged.mount: Deactivated successfully. Nov 26 02:59:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11d177bb8d3637875ec51fb429bbc4a9c2772fb33cc75a2e136cd81f3830fc98-userdata-shm.mount: Deactivated successfully. 
Nov 26 02:59:09 localhost podman[54278]: 2025-11-26 07:59:09.035825537 +0000 UTC m=+0.077546900 container create b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Nov 26 02:59:09 localhost systemd[1]: Started libpod-conmon-b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.scope. Nov 26 02:59:09 localhost systemd[1]: Started libcrun container. Nov 26 02:59:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772983d29741817fb5112b04db0ec34846c51e947d40ce51144a956997c63192/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 26 02:59:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772983d29741817fb5112b04db0ec34846c51e947d40ce51144a956997c63192/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 26 02:59:09 localhost podman[54278]: 2025-11-26 07:59:08.997483956 +0000 UTC m=+0.039205329 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 26 02:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 02:59:09 localhost podman[54278]: 2025-11-26 07:59:09.122584274 +0000 UTC m=+0.164305637 container init b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 02:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 02:59:09 localhost podman[54278]: 2025-11-26 07:59:09.167669026 +0000 UTC m=+0.209390389 container start b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 02:59:09 localhost python3[54144]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=cb52c88276a571bf332b7657a13eab07 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 26 02:59:09 localhost podman[54301]: 2025-11-26 07:59:09.262835927 +0000 UTC m=+0.087526162 
container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Nov 26 02:59:09 localhost podman[54301]: 2025-11-26 07:59:09.471292617 +0000 UTC m=+0.295982852 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 02:59:09 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 02:59:09 localhost python3[54374]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:59:09 localhost sshd[54390]: main: sshd: ssh-rsa algorithm is disabled Nov 26 02:59:10 localhost python3[54391]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 02:59:10 localhost python3[54453]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764143950.1165714-85106-118975249135556/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:59:10 localhost python3[54469]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 02:59:11 localhost systemd[1]: Reloading. Nov 26 02:59:11 localhost systemd-rc-local-generator[54490]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:59:11 localhost systemd-sysv-generator[54494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 02:59:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:59:11 localhost python3[54520]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 02:59:11 localhost systemd[1]: Reloading. Nov 26 02:59:11 localhost systemd-sysv-generator[54553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 02:59:11 localhost systemd-rc-local-generator[54550]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 02:59:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 02:59:12 localhost systemd[1]: Starting metrics_qdr container... Nov 26 02:59:12 localhost systemd[1]: Started metrics_qdr container. 
Nov 26 02:59:12 localhost python3[54602]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:59:13 localhost python3[54723]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005536118 step=1 update_config_hash_only=False Nov 26 02:59:14 localhost python3[54739]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 02:59:14 localhost python3[54755]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 26 02:59:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 02:59:39 localhost podman[54756]: 2025-11-26 07:59:39.830369947 +0000 UTC m=+0.086677196 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step1) Nov 26 02:59:40 localhost podman[54756]: 2025-11-26 07:59:40.036772922 +0000 UTC m=+0.293080161 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, vcs-type=git) Nov 26 02:59:40 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 02:59:50 localhost sshd[54863]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:00:10 localhost podman[54865]: 2025-11-26 08:00:10.810249598 +0000 UTC m=+0.074214295 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:00:11 localhost podman[54865]: 2025-11-26 08:00:11.019621107 +0000 UTC m=+0.283585804 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4) Nov 26 03:00:11 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:00:34 localhost sshd[54895]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:00:41 localhost systemd[1]: tmp-crun.PVtO2D.mount: Deactivated successfully. 
Nov 26 03:00:41 localhost podman[54897]: 2025-11-26 08:00:41.822154359 +0000 UTC m=+0.085254282 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:00:42 localhost podman[54897]: 2025-11-26 08:00:42.041227441 +0000 UTC m=+0.304327304 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:00:42 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:01:12 localhost systemd[1]: tmp-crun.W9b1cH.mount: Deactivated successfully. 
Nov 26 03:01:12 localhost podman[55015]: 2025-11-26 08:01:12.823183536 +0000 UTC m=+0.087953374 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4) Nov 26 03:01:13 localhost podman[55015]: 2025-11-26 08:01:13.01430392 +0000 UTC m=+0.279073768 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-qdrouterd) Nov 26 03:01:13 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:01:16 localhost sshd[55044]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:01:43 localhost podman[55046]: 2025-11-26 08:01:43.814349405 +0000 UTC m=+0.079677379 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1) Nov 26 03:01:44 localhost podman[55046]: 2025-11-26 08:01:44.007345505 +0000 UTC m=+0.272673509 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Nov 26 03:01:44 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:01:59 localhost sshd[55151]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:02:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:02:14 localhost systemd[1]: tmp-crun.toa8EV.mount: Deactivated successfully. 
Nov 26 03:02:14 localhost podman[55153]: 2025-11-26 08:02:14.833299439 +0000 UTC m=+0.086242381 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:02:15 localhost podman[55153]: 2025-11-26 08:02:15.055219602 +0000 UTC m=+0.308162514 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Nov 26 03:02:15 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:02:43 localhost sshd[55183]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:02:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:02:45 localhost systemd[1]: tmp-crun.H1Bh1T.mount: Deactivated successfully. 
Nov 26 03:02:45 localhost podman[55185]: 2025-11-26 08:02:45.831143993 +0000 UTC m=+0.088875761 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Nov 26 03:02:46 localhost podman[55185]: 2025-11-26 08:02:46.006332076 +0000 UTC m=+0.264063794 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 26 03:02:46 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:03:16 localhost podman[55290]: 2025-11-26 08:03:16.821852136 +0000 UTC m=+0.074690441 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:03:16 localhost podman[55290]: 2025-11-26 08:03:16.993705033 +0000 UTC m=+0.246543368 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:03:17 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:03:27 localhost sshd[55320]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:03:47 localhost podman[55322]: 2025-11-26 08:03:47.827439727 +0000 UTC m=+0.081526290 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-qdrouterd) Nov 26 03:03:48 localhost podman[55322]: 2025-11-26 08:03:48.070508997 +0000 UTC m=+0.324595570 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:03:48 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:04:00 localhost ceph-osd[31674]: osd.0 pg_epoch: 18 pg[2.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [2,0,3] r=1 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:00 localhost ceph-osd[31674]: osd.0 pg_epoch: 19 pg[3.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [0,2,1] r=0 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:01 localhost ceph-osd[31674]: osd.0 pg_epoch: 20 pg[3.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [0,2,1] r=0 lpr=19 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:02 localhost ceph-osd[31674]: osd.0 pg_epoch: 20 pg[4.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [3,5,0] r=2 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:03 localhost ceph-osd[31674]: osd.0 pg_epoch: 22 pg[5.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [0,1,2] r=0 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:04 localhost ceph-osd[31674]: osd.0 pg_epoch: 23 pg[5.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [0,1,2] r=0 lpr=22 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:10 localhost sshd[55433]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:04:16 localhost ceph-osd[32631]: osd.4 pg_epoch: 28 pg[6.0( empty local-lis/les=0/0 n=0 ec=28/28 lis/c=0/0 les/c/f=0/0/0 sis=28) [1,5,4] r=2 lpr=28 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:18 localhost ceph-osd[31674]: osd.0 pg_epoch: 29 pg[7.0( empty local-lis/les=0/0 n=0 ec=29/29 lis/c=0/0 les/c/f=0/0/0 sis=29) [5,0,3] r=1 lpr=29 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:04:18 localhost podman[55435]: 2025-11-26 08:04:18.841503786 +0000 UTC m=+0.098780518 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd) Nov 26 03:04:19 localhost podman[55435]: 2025-11-26 08:04:19.035262845 +0000 UTC m=+0.292539527 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, container_name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:04:19 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:04:37 localhost ceph-osd[31674]: osd.0 pg_epoch: 34 pg[3.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=34 pruub=12.149264336s) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active pruub 1124.516357422s@ mbc={}] start_peering_interval up [0,2,1] -> [0,2,1], acting [0,2,1] -> [0,2,1], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:37 localhost ceph-osd[31674]: osd.0 pg_epoch: 34 pg[2.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=34 pruub=11.257480621s) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 active pruub 1123.624511719s@ mbc={}] start_peering_interval up [2,0,3] -> [2,0,3], acting [2,0,3] -> [2,0,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:37 localhost ceph-osd[31674]: osd.0 pg_epoch: 34 pg[3.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=34 pruub=12.149264336s) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown pruub 1124.516357422s@ mbc={}] state: transitioning to Primary Nov 26 03:04:37 localhost ceph-osd[31674]: osd.0 pg_epoch: 34 pg[2.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=34 pruub=11.254325867s) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.624511719s@ mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.1f( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.1d( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost 
ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.1e( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.1c( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.19( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.1a( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.4( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.2( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.17( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.8( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost 
ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.9( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.7( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.16( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.19( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.18( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.18( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.16( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.1b( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 
pg_epoch: 35 pg[2.15( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.17( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.15( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.14( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.14( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.11( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.13( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.13( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.f( empty 
local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.12( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.12( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.10( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.11( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.e( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.e( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.10( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.d( empty local-lis/les=18/19 n=0 
ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.f( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.c( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.c( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.b( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.a( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.a( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.d( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.b( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 
les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.3( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.2( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.8( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.9( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.1( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.6( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.7( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 
lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.6( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.4( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.3( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.5( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[2.5( empty local-lis/les=18/19 n=0 ec=34/18 lis/c=18/18 les/c/f=19/19/0 sis=34) [2,0,3] r=1 lpr=34 pi=[18,34)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1b( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1d( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1c( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown 
mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1a( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1e( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1f( empty local-lis/les=19/20 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.0( empty local-lis/les=34/35 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1a( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.17( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.19( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.14( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 
pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.18( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.11( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.15( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.f( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.13( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.12( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.16( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: 
osd.0 pg_epoch: 35 pg[3.e( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.d( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.c( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.b( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.5( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.8( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.7( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.3( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 
active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.2( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.a( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1d( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.6( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.9( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1e( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1b( empty 
local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.4( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1c( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.1f( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:38 localhost ceph-osd[31674]: osd.0 pg_epoch: 35 pg[3.10( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=19/19 les/c/f=20/20/0 sis=34) [0,2,1] r=0 lpr=34 pi=[19,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:39 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.0 deep-scrub starts Nov 26 03:04:39 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.0 deep-scrub ok Nov 26 03:04:39 localhost ceph-osd[31674]: osd.0 pg_epoch: 36 pg[4.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=36 pruub=12.151624680s) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 active pruub 1126.539184570s@ mbc={}] start_peering_interval up [3,5,0] -> [3,5,0], acting [3,5,0] -> [3,5,0], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:39 localhost ceph-osd[31674]: osd.0 pg_epoch: 36 pg[5.0( empty local-lis/les=22/23 n=0 ec=22/22 
lis/c=22/22 les/c/f=23/23/0 sis=36 pruub=13.203097343s) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active pruub 1127.590942383s@ mbc={}] start_peering_interval up [0,1,2] -> [0,1,2], acting [0,1,2] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:39 localhost ceph-osd[31674]: osd.0 pg_epoch: 36 pg[5.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=36 pruub=13.203097343s) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.590942383s@ mbc={}] state: transitioning to Primary Nov 26 03:04:39 localhost ceph-osd[31674]: osd.0 pg_epoch: 36 pg[4.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=36 pruub=12.147985458s) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1126.539184570s@ mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.1f( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.10( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.19( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.12( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.13( empty 
local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.1e( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.18( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.15( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.17( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.18( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.19( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.1b( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.16( empty 
local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.1a( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1b( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1a( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.14( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1c( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1d( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1e( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.1c( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1f( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.2( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.1d( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.4( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.5( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.3( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.1( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.2( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.7( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.6( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.3( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.6( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.7( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.8( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.5( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.4( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.b( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.e( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.f( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.f( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.d( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.e( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.d( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.c( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.a( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.c( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.9( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.b( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.a( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.9( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.8( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.17( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.15( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.16( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.14( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.13( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.12( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.11( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[4.11( empty local-lis/les=20/21 n=0 ec=36/20 lis/c=20/20 les/c/f=21/21/0 sis=36) [3,5,0] r=2 lpr=36 pi=[20,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.10( empty local-lis/les=22/23 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.0( empty local-lis/les=36/37 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.19( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1a( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.f( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.e( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.c( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.5( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.2( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1b( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.3( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.4( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.a( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.8( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.d( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1e( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.7( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1f( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1c( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.18( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.6( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.9( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.10( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.1d( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.13( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.11( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.12( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.b( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.17( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.14( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.15( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:40 localhost ceph-osd[31674]: osd.0 pg_epoch: 37 pg[5.16( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=22/22 les/c/f=23/23/0 sis=36) [0,1,2] r=0 lpr=36 pi=[22,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 26 03:04:41 localhost ceph-osd[32631]: osd.4 pg_epoch: 38 pg[6.0( empty local-lis/les=28/29 n=0 ec=28/28 lis/c=28/28 les/c/f=29/29/0 sis=38 pruub=15.057910919s) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 active pruub 1127.126220703s@ mbc={}] start_peering_interval up [1,5,4] -> [1,5,4], acting [1,5,4] -> [1,5,4], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:41 localhost ceph-osd[32631]: osd.4 pg_epoch: 38 pg[6.0( empty local-lis/les=28/29 n=0 ec=28/28 lis/c=28/28 les/c/f=29/29/0 sis=38 pruub=15.055257797s) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.126220703s@ mbc={}] state: transitioning to Stray
Nov 26 03:04:41 localhost ceph-osd[31674]: osd.0 pg_epoch: 38 pg[7.0( v 31'39 (0'0,31'39] local-lis/les=29/30 n=22 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=38 pruub=8.575904846s) [5,0,3] r=1 lpr=38 pi=[29,38)/1 luod=0'0 lua=31'37 crt=31'39 lcod 31'38 mlcod 0'0 active pruub 1125.075439453s@ mbc={}] start_peering_interval up [5,0,3] -> [5,0,3], acting [5,0,3] -> [5,0,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:41 localhost ceph-osd[31674]: osd.0 pg_epoch: 38 pg[7.0( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=29/29 lis/c=29/29 les/c/f=30/30/0 sis=38 pruub=8.573743820s) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 lcod 31'38 mlcod 0'0 unknown NOTIFY pruub 1125.075439453s@ mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.1f( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.1e( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.c( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.a( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.1d( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.7( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.18( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.b( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.8( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.6( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.1b( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.5( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.9( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.19( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.3( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.2( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.4( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.1( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.d( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.f( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.e( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.11( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.10( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.14( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.13( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.12( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.15( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.16( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.17( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.1a( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[32631]: osd.4 pg_epoch: 39 pg[6.1c( empty local-lis/les=28/29 n=0 ec=38/28 lis/c=28/28 les/c/f=29/29/0 sis=38) [1,5,4] r=2 lpr=38 pi=[28,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=2 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=29/30 n=1 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:42 localhost ceph-osd[31674]: osd.0 pg_epoch: 39 pg[7.1( v 31'39 (0'0,31'39] local-lis/les=29/30 n=2 ec=38/29 lis/c=29/29 les/c/f=30/30/0 sis=38) [5,0,3] r=1 lpr=38 pi=[29,38)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 26 03:04:43 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Nov 26 03:04:43 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Nov 26 03:04:44 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Nov 26 03:04:44 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Nov 26 03:04:47 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Nov 26 03:04:47 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.12( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [0,2,1] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.15( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531064987s) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.395996094s@ mbc={}] start_peering_interval up [0,2,1] -> [0,2,3], acting [0,2,1] -> [0,2,3], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.15( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531064987s) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.395996094s@ mbc={}] state: transitioning to Primary
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.15( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.557859421s) [5,1,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.423217773s@ mbc={}] start_peering_interval up [0,1,2] -> [5,1,0], acting [0,1,2] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.12( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531510353s) [1,5,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396850586s@ mbc={}] start_peering_interval up [0,2,1] -> [1,5,0], acting [0,2,1] -> [1,5,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.15( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.557815552s) [5,1,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.423217773s@ mbc={}] state: transitioning to Stray
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.12( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531427383s) [1,5,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396850586s@ mbc={}] state: transitioning to Stray
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.10( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.532307625s) [5,0,3] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397827148s@ mbc={}] start_peering_interval up [0,2,1] -> [5,0,3], acting [0,2,1] -> [5,0,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.16( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.557715416s) [5,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.423217773s@ mbc={}] start_peering_interval up [0,1,2] -> [5,3,0], acting [0,1,2] -> [5,3,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.8( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.554381371s) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419921875s@ mbc={}] start_peering_interval up [0,1,2] -> [0,1,5], acting [0,1,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.10( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.532279968s) [5,0,3] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397827148s@ mbc={}] state: transitioning to Stray
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.16( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.557681084s) [5,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.423217773s@ mbc={}] state: transitioning to Stray
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.8( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.554381371s) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.419921875s@ mbc={}] state: transitioning to Primary
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.e( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531207085s) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396972656s@ mbc={}] start_peering_interval up [0,2,1] -> [0,5,3], acting [0,2,1] -> [0,5,3], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.9( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.555023193s) [5,0,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.420654297s@ mbc={}] start_peering_interval up [0,1,2] -> [5,0,1], acting [0,1,2] -> [5,0,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.e( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531207085s) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.396972656s@ mbc={}] state: transitioning to Primary
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.9( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.554998398s) [5,0,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.420654297s@ mbc={}] state: transitioning to Stray
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.c( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531152725s) [5,3,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396972656s@ mbc={}] start_peering_interval up [0,2,1] -> [5,3,0], acting [0,2,1] -> [5,3,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.c( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531101227s) [5,3,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396972656s@ mbc={}] state: transitioning to Stray
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.b( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.555727005s) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.421630859s@ mbc={}] start_peering_interval up [0,1,2] -> [0,1,5], acting [0,1,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.b(
empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.555727005s) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.421630859s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.d( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.530930519s) [5,0,3] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396972656s@ mbc={}] start_peering_interval up [0,2,1] -> [5,0,3], acting [0,2,1] -> [5,0,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.d( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.530901909s) [5,0,3] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396972656s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.c( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.553213120s) [1,2,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419311523s@ mbc={}] start_peering_interval up [0,1,2] -> [1,2,4], acting [0,1,2] -> [1,2,4], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.c( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.553182602s) [1,2,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419311523s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.17( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524384499s) [1,5,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.390380859s@ mbc={}] start_peering_interval up [0,2,1] -> [1,5,0], acting 
[0,2,1] -> [1,5,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.6( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.554113388s) [1,5,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.420166016s@ mbc={}] start_peering_interval up [0,1,2] -> [1,5,0], acting [0,1,2] -> [1,5,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531469345s) [1,2,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397583008s@ mbc={}] start_peering_interval up [0,2,1] -> [1,2,4], acting [0,2,1] -> [1,2,4], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.6( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531396866s) [1,2,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397583008s@ mbc={}] start_peering_interval up [0,2,1] -> [1,2,4], acting [0,2,1] -> [1,2,4], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.6( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.554066658s) [1,5,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.420166016s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531442642s) [1,2,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown 
NOTIFY pruub 1137.397583008s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.17( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524287224s) [1,5,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.390380859s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.6( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531371117s) [1,2,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397583008s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.4( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531420708s) [1,0,2] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397705078s@ mbc={}] start_peering_interval up [0,2,1] -> [1,0,2], acting [0,2,1] -> [1,0,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.4( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.531394958s) [1,0,2] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397705078s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.5( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.552867889s) [1,2,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419311523s@ mbc={}] start_peering_interval up [0,1,2] -> [1,2,0], acting [0,1,2] -> [1,2,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.5( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.552842140s) 
[1,2,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419311523s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.3( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.553058624s) [1,5,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419555664s@ mbc={}] start_peering_interval up [0,1,2] -> [1,5,0], acting [0,1,2] -> [1,5,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.3( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.553030968s) [1,5,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419555664s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.19( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.523960114s) [1,2,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.390502930s@ mbc={}] start_peering_interval up [0,2,1] -> [1,2,0], acting [0,2,1] -> [1,2,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1e( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.553524017s) [1,2,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.420043945s@ mbc={}] start_peering_interval up [0,1,2] -> [1,2,4], acting [0,1,2] -> [1,2,4], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.19( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.523932457s) [1,2,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1137.390502930s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1c( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.703365326s) [4,3,5] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.164672852s@ mbc={}] start_peering_interval up [1,5,4] -> [4,3,5], acting [1,5,4] -> [4,3,5], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1a( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.709716797s) [5,0,1] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.171142578s@ mbc={}] start_peering_interval up [1,5,4] -> [5,0,1], acting [1,5,4] -> [5,0,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1c( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.703365326s) [4,3,5] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1129.164672852s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1a( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.709658623s) [5,0,1] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.171142578s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1e( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.553498268s) [1,2,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.420043945s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.14( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.709445000s) [1,5,0] 
r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.171142578s@ mbc={}] start_peering_interval up [1,5,4] -> [1,5,0], acting [1,5,4] -> [1,5,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.11( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.17( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.709058762s) [4,1,2] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.170776367s@ mbc={}] start_peering_interval up [1,5,4] -> [4,1,2], acting [1,5,4] -> [4,1,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.14( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.709377289s) [1,5,0] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.171142578s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.17( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.709058762s) [4,1,2] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1129.170776367s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.12( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.703918457s) [0,2,1] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165893555s@ mbc={}] start_peering_interval up [1,5,4] -> [0,2,1], acting [1,5,4] -> [0,2,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 
03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.12( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,5,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1d( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.554248810s) [1,2,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.421020508s@ mbc={}] start_peering_interval up [0,1,2] -> [1,2,0], acting [0,1,2] -> [1,2,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.12( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.703841209s) [0,2,1] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165893555s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1f( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.518447876s) [3,2,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.385131836s@ mbc={}] start_peering_interval up [2,0,3] -> [3,2,4], acting [2,0,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1a( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.552224159s) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419067383s@ mbc={}] start_peering_interval up [0,1,2] -> [0,5,1], acting [0,1,2] -> [0,5,1], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1f( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 
les/c/f=35/35/0 sis=40 pruub=14.518418312s) [3,2,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.385131836s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1d( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.554205894s) [1,2,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.421020508s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1a( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.552224159s) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.419067383s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1b( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.530854225s) [0,5,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397705078s@ mbc={}] start_peering_interval up [0,2,1] -> [0,5,1], acting [0,2,1] -> [0,5,1], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1b( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.530854225s) [0,5,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.397705078s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.4( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.552698135s) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419555664s@ mbc={}] start_peering_interval up [0,1,2] -> [0,1,5], acting [0,1,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.13( 
empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,3,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.1b( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [0,2,3] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.4( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.552698135s) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.419555664s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.19( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.547824860s) [0,3,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415893555s@ mbc={}] start_peering_interval up [3,5,0] -> [0,3,2], acting [3,5,0] -> [0,3,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.19( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.547824860s) [0,3,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.415893555s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.19( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546835899s) [1,5,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415039062s@ mbc={}] start_peering_interval up [0,1,2] -> [1,5,0], acting [0,1,2] -> [1,5,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1e( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 
les/c/f=35/35/0 sis=40 pruub=14.529251099s) [3,4,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397583008s@ mbc={}] start_peering_interval up [0,2,1] -> [3,4,5], acting [0,2,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1e( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.529218674s) [3,4,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397583008s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.18( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.551806450s) [2,0,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.420166016s@ mbc={}] start_peering_interval up [0,1,2] -> [2,0,1], acting [0,1,2] -> [2,0,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.19( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546731949s) [1,5,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415039062s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1f( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.529320717s) [1,5,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397827148s@ mbc={}] start_peering_interval up [0,2,1] -> [1,5,4], acting [0,2,1] -> [1,5,4], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.18( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.551752090s) [2,0,1] r=1 lpr=40 
pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.420166016s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1f( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.529288292s) [1,5,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397827148s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1e( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527795792s) [3,4,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396362305s@ mbc={}] start_peering_interval up [2,0,3] -> [3,4,5], acting [2,0,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1e( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527745247s) [3,4,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396362305s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.18( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546166420s) [2,1,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.414794922s@ mbc={}] start_peering_interval up [3,5,0] -> [2,1,0], acting [3,5,0] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.18( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546140671s) [2,1,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.414794922s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1d( empty local-lis/les=34/35 n=0 ec=34/18 
lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527873039s) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396606445s@ mbc={}] start_peering_interval up [2,0,3] -> [0,5,3], acting [2,0,3] -> [0,5,3], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1b( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548849106s) [2,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417602539s@ mbc={}] start_peering_interval up [3,5,0] -> [2,3,0], acting [3,5,0] -> [2,3,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1c( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.528992653s) [5,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397705078s@ mbc={}] start_peering_interval up [0,2,1] -> [5,1,4], acting [0,2,1] -> [5,1,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1b( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548826218s) [2,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.417602539s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1d( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527873039s) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.396606445s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1c( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527559280s) [4,2,1] r=-1 lpr=40 
pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396362305s@ mbc={}] start_peering_interval up [2,0,3] -> [4,2,1], acting [2,0,3] -> [4,2,1], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1c( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.528932571s) [5,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397705078s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1c( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527530670s) [4,2,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396362305s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.d( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,5,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1d( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.528516769s) [4,2,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397460938s@ mbc={}] start_peering_interval up [0,2,1] -> [4,2,1], acting [0,2,1] -> [4,2,1], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1a( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548540115s) [2,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417602539s@ mbc={}] start_peering_interval up [3,5,0] -> [2,3,0], acting [3,5,0] -> [2,3,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 
03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1a( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548517227s) [2,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.417602539s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1b( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.528149605s) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397216797s@ mbc={}] start_peering_interval up [2,0,3] -> [0,2,3], acting [2,0,3] -> [0,2,3], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1d( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.528487206s) [4,2,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397460938s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1b( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.550255775s) [2,3,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419311523s@ mbc={}] start_peering_interval up [0,1,2] -> [2,3,4], acting [0,1,2] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1b( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.528149605s) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.397216797s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1a( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.521301270s) [4,1,2] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 
1137.390380859s@ mbc={}] start_peering_interval up [0,2,1] -> [4,1,2], acting [0,2,1] -> [4,1,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1d( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548791885s) [0,5,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417968750s@ mbc={}] start_peering_interval up [3,5,0] -> [0,5,3], acting [3,5,0] -> [0,5,3], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1b( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.550207138s) [2,3,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419311523s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.1a( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.521269798s) [4,1,2] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.390380859s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1d( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548791885s) [0,5,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.417968750s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1c( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.550916672s) [2,4,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.420166016s@ mbc={}] start_peering_interval up [0,1,2] -> [2,4,3], acting [0,1,2] -> [2,4,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 
localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1a( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.515859604s) [2,4,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.385131836s@ mbc={}] start_peering_interval up [2,0,3] -> [2,4,1], acting [2,0,3] -> [2,4,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1c( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.550872803s) [2,4,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.420166016s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1c( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548184395s) [4,1,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417602539s@ mbc={}] start_peering_interval up [3,5,0] -> [4,1,2], acting [3,5,0] -> [4,1,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1a( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.515836716s) [2,4,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.385131836s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.19( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527242661s) [3,2,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396728516s@ mbc={}] start_peering_interval up [2,0,3] -> [3,2,0], acting [2,0,3] -> [3,2,0], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.19( empty 
local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527215004s) [3,2,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396728516s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1c( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548160553s) [4,1,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.417602539s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1f( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548047066s) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417724609s@ mbc={}] start_peering_interval up [3,5,0] -> [0,5,1], acting [3,5,0] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.18( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526444435s) [3,0,5] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.395996094s@ mbc={}] start_peering_interval up [0,2,1] -> [3,0,5], acting [0,2,1] -> [3,0,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.18( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526410103s) [3,0,5] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.395996094s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.e( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,3,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 
pg[5.1f( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.550448418s) [2,4,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.420166016s@ mbc={}] start_peering_interval up [0,1,2] -> [2,4,1], acting [0,1,2] -> [2,4,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1f( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.550419807s) [2,4,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.420166016s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1f( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548047066s) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.417724609s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1e( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548219681s) [1,5,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.418090820s@ mbc={}] start_peering_interval up [3,5,0] -> [1,5,0], acting [3,5,0] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1e( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548196793s) [1,5,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.418090820s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.1( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.714408875s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.584350586s@ mbc={}] 
start_peering_interval up [5,0,3] -> [2,4,3], acting [5,0,3] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.1( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.714364052s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.584350586s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.18( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527700424s) [4,2,3] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397705078s@ mbc={}] start_peering_interval up [2,0,3] -> [4,2,3], acting [2,0,3] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.4( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526471138s) [1,0,2] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396606445s@ mbc={}] start_peering_interval up [2,0,3] -> [1,0,2], acting [2,0,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.5( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526962280s) [5,3,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397094727s@ mbc={}] start_peering_interval up [0,2,1] -> [5,3,4], acting [0,2,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.2( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 
pruub=8.545653343s) [4,5,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415771484s@ mbc={}] start_peering_interval up [3,5,0] -> [4,5,1], acting [3,5,0] -> [4,5,1], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.18( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527669907s) [4,2,3] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397705078s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.4( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526445389s) [1,0,2] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396606445s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.2( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545627594s) [4,5,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415771484s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.8( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,1,5] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.5( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526906013s) [5,3,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397094727s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.2( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526351929s) [5,1,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396606445s@ mbc={}] start_peering_interval up [2,0,3] 
-> [5,1,0], acting [2,0,3] -> [5,1,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.2( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526328087s) [5,1,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396606445s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.714810371s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.585083008s@ mbc={}] start_peering_interval up [5,0,3] -> [2,4,3], acting [5,0,3] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.3( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526964188s) [2,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397338867s@ mbc={}] start_peering_interval up [0,2,1] -> [2,1,4], acting [0,2,1] -> [2,1,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.3( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526930809s) [2,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397338867s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.714781761s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.585083008s@ mbc={}] state: transitioning to Stray Nov 26 
03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.5( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527487755s) [0,3,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397949219s@ mbc={}] start_peering_interval up [2,0,3] -> [0,3,2], acting [2,0,3] -> [0,3,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.4( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545720100s) [1,0,2] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.416137695s@ mbc={}] start_peering_interval up [3,5,0] -> [1,0,2], acting [3,5,0] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.d( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.705169678s) [2,1,0] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.171020508s@ mbc={}] start_peering_interval up [1,5,4] -> [2,1,0], acting [1,5,4] -> [2,1,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.5( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527487755s) [0,3,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.397949219s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.d( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.705135345s) [2,1,0] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.171020508s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.10( empty 
local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.705083847s) [1,0,2] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.171020508s@ mbc={}] start_peering_interval up [1,5,4] -> [1,0,2], acting [1,5,4] -> [1,0,2], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.10( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.705053329s) [1,0,2] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.171020508s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.4( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545669556s) [1,0,2] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.416137695s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.3( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545343399s) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415893555s@ mbc={}] start_peering_interval up [3,5,0] -> [0,5,1], acting [3,5,0] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.2( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548690796s) [5,1,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419311523s@ mbc={}] start_peering_interval up [0,1,2] -> [5,1,0], acting [0,1,2] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.3( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 
pruub=8.545343399s) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.415893555s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.2( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548657417s) [5,1,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419311523s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.699552536s) [4,5,3] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165771484s@ mbc={}] start_peering_interval up [1,5,4] -> [4,5,3], acting [1,5,4] -> [4,5,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.7( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526018143s) [5,0,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396728516s@ mbc={}] start_peering_interval up [2,0,3] -> [5,0,1], acting [2,0,3] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.699552536s) [4,5,3] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1129.165771484s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.7( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.525993347s) [5,0,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396728516s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.18( empty local-lis/les=38/39 
n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.699154854s) [1,2,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165527344s@ mbc={}] start_peering_interval up [1,5,4] -> [1,2,4], acting [1,5,4] -> [1,2,4], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545244217s) [4,2,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.416015625s@ mbc={}] start_peering_interval up [3,5,0] -> [4,2,1], acting [3,5,0] -> [4,2,1], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.1d( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,2,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.4( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.698431015s) [1,2,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.164916992s@ mbc={}] start_peering_interval up [1,5,4] -> [1,2,4], acting [1,5,4] -> [1,2,4], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.18( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.698963165s) [1,2,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165527344s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.3( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.718173981s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 luod=0'0 
crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.588867188s@ mbc={}] start_peering_interval up [5,0,3] -> [2,4,3], acting [5,0,3] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.1( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545216560s) [4,2,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.416015625s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.4( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.698365211s) [1,2,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.164916992s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.3( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.718143463s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.588867188s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.a( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.698361397s) [5,3,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165039062s@ mbc={}] start_peering_interval up [1,5,4] -> [5,3,4], acting [1,5,4] -> [5,3,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.7( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526396751s) [3,0,2] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397216797s@ mbc={}] start_peering_interval up [0,2,1] -> [3,0,2], acting [0,2,1] -> [3,0,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> 1, features 
acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.9( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,2,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.a( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.698319435s) [5,3,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165039062s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.6( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527070999s) [1,0,5] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397949219s@ mbc={}] start_peering_interval up [2,0,3] -> [1,0,5], acting [2,0,3] -> [1,0,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.6( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.527046204s) [1,0,5] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397949219s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.7( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526344299s) [3,0,2] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397216797s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548480034s) [2,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419433594s@ mbc={}] start_peering_interval up [0,1,2] -> [2,3,0], acting [0,1,2] -> [2,3,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> 2, 
features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526992798s) [3,4,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397949219s@ mbc={}] start_peering_interval up [2,0,3] -> [3,4,5], acting [2,0,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.1( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548452377s) [2,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419433594s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.7( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546923637s) [1,0,2] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417968750s@ mbc={}] start_peering_interval up [3,5,0] -> [1,0,2], acting [3,5,0] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.1( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526955605s) [3,4,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397949219s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.7( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546897888s) [1,0,2] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.417968750s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.5( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 
pruub=10.713837624s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.584960938s@ mbc={}] start_peering_interval up [5,0,3] -> [2,4,3], acting [5,0,3] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1e( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695622444s) [4,5,3] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.162841797s@ mbc={}] start_peering_interval up [1,5,4] -> [4,5,3], acting [1,5,4] -> [4,5,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1e( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695622444s) [4,5,3] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1129.162841797s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.5( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.713810921s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.584960938s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.7( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548915863s) [5,1,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.420043945s@ mbc={}] start_peering_interval up [0,1,2] -> [5,1,4], acting [0,1,2] -> [5,1,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.6( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546507835s) [0,1,2] r=0 
lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417724609s@ mbc={}] start_peering_interval up [3,5,0] -> [0,1,2], acting [3,5,0] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.7( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.548887253s) [5,1,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.420043945s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.2( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526097298s) [3,4,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397338867s@ mbc={}] start_peering_interval up [0,2,1] -> [3,4,5], acting [0,2,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.2( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526067734s) [3,4,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397338867s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.3( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526603699s) [5,3,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397827148s@ mbc={}] start_peering_interval up [2,0,3] -> [5,3,0], acting [2,0,3] -> [5,3,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.3( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526559830s) [5,3,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397827148s@ 
mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.5( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.544772148s) [5,0,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.416137695s@ mbc={}] start_peering_interval up [3,5,0] -> [5,0,1], acting [3,5,0] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.5( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.544744492s) [5,0,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.416137695s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.9( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526146889s) [4,2,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397583008s@ mbc={}] start_peering_interval up [0,2,1] -> [4,2,1], acting [0,2,1] -> [4,2,1], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.713279724s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.584716797s@ mbc={}] start_peering_interval up [5,0,3] -> [2,4,3], acting [5,0,3] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.9( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.526122093s) [4,2,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397583008s@ mbc={}] state: transitioning to Stray Nov 
26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.8( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.525237083s) [2,0,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396728516s@ mbc={}] start_peering_interval up [2,0,3] -> [2,0,1], acting [2,0,3] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.713249207s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.584716797s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.8( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.525213242s) [2,0,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396728516s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.1a( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,1,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.e( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543787956s) [2,4,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415283203s@ mbc={}] start_peering_interval up [3,5,0] -> [2,4,1], acting [3,5,0] -> [2,4,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.f( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.547409058s) [5,0,3] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419067383s@ mbc={}] 
start_peering_interval up [0,1,2] -> [5,0,3], acting [0,1,2] -> [5,0,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.f( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.547382355s) [5,0,3] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419067383s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.6( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546507835s) [0,1,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.417724609s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1f( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.694233894s) [3,0,5] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.162597656s@ mbc={}] start_peering_interval up [1,5,4] -> [3,0,5], acting [1,5,4] -> [3,0,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.8( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.525150299s) [4,1,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396972656s@ mbc={}] start_peering_interval up [0,2,1] -> [4,1,5], acting [0,2,1] -> [4,1,5], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.c( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.697188377s) [3,2,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165649414s@ mbc={}] start_peering_interval up [1,5,4] -> [3,2,4], acting [1,5,4] -> [3,2,4], 
acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.8( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.696886063s) [2,0,3] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165405273s@ mbc={}] start_peering_interval up [1,5,4] -> [2,0,3], acting [1,5,4] -> [2,0,3], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1d( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.692039490s) [1,4,5] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.160522461s@ mbc={}] start_peering_interval up [1,5,4] -> [1,4,5], acting [1,5,4] -> [1,4,5], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.c( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.697107315s) [3,2,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165649414s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1f( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.694190025s) [3,0,5] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.162597656s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1d( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.692004204s) [1,4,5] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.160522461s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.8( empty local-lis/les=38/39 n=0 ec=38/28 
lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.696784973s) [2,0,3] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165405273s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.8( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.525108337s) [4,1,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396972656s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.e( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543766022s) [2,4,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415283203s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.f( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543153763s) [1,4,5] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415161133s@ mbc={}] start_peering_interval up [3,5,0] -> [1,4,5], acting [3,5,0] -> [1,4,5], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.9( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524765968s) [1,5,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396972656s@ mbc={}] start_peering_interval up [2,0,3] -> [1,5,4], acting [2,0,3] -> [1,5,4], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.f( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543128967s) [1,4,5] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415161133s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: 
osd.0 pg_epoch: 40 pg[5.e( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546971321s) [4,3,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419189453s@ mbc={}] start_peering_interval up [0,1,2] -> [4,3,2], acting [0,1,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.9( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524712563s) [1,5,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396972656s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.b( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524639130s) [3,5,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397094727s@ mbc={}] start_peering_interval up [0,2,1] -> [3,5,4], acting [0,2,1] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.b( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524610519s) [3,5,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397094727s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.7( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.696077347s) [5,3,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165039062s@ mbc={}] start_peering_interval up [1,5,4] -> [5,3,4], acting [1,5,4] -> [5,3,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.7( empty local-lis/les=38/39 n=0 ec=38/28 
lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.696048737s) [5,3,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165039062s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.a( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.525521278s) [0,3,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397949219s@ mbc={}] start_peering_interval up [2,0,3] -> [0,3,2], acting [2,0,3] -> [0,3,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.c( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543226242s) [5,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415771484s@ mbc={}] start_peering_interval up [3,5,0] -> [5,3,0], acting [3,5,0] -> [5,3,0], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.a( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.525521278s) [0,3,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.397949219s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.c( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543202400s) [5,3,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415771484s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.9( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.696534157s) [3,4,5] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165893555s@ mbc={}] start_peering_interval up [1,5,4] -> [3,4,5], acting [1,5,4] -> [3,4,5], acting_primary 1 -> 3, 
up_primary 1 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.d( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.547184944s) [4,5,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419921875s@ mbc={}] start_peering_interval up [0,1,2] -> [4,5,3], acting [0,1,2] -> [4,5,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.9( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.696483612s) [3,4,5] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165893555s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.e( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546456337s) [4,3,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419189453s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.19( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695762634s) [5,1,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165283203s@ mbc={}] start_peering_interval up [1,5,4] -> [5,1,4], acting [1,5,4] -> [5,1,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.712112427s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.584960938s@ mbc={}] start_peering_interval up [5,0,3] -> [2,4,3], acting [5,0,3] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features 
acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.d( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.547129631s) [4,5,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419921875s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.19( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695738792s) [5,1,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165283203s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.712077141s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.584960938s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.b( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524798393s) [0,5,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397705078s@ mbc={}] start_peering_interval up [2,0,3] -> [0,5,1], acting [2,0,3] -> [0,5,1], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.6( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695596695s) [3,5,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165405273s@ mbc={}] start_peering_interval up [1,5,4] -> [3,5,4], acting [1,5,4] -> [3,5,4], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1b( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 
pruub=10.695829391s) [0,2,3] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165649414s@ mbc={}] start_peering_interval up [1,5,4] -> [0,2,3], acting [1,5,4] -> [0,2,3], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.6( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695567131s) [3,5,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165405273s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.3( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695649147s) [5,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165405273s@ mbc={}] start_peering_interval up [1,5,4] -> [5,4,3], acting [1,5,4] -> [5,4,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.1b( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695797920s) [0,2,3] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165649414s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.3( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695531845s) [5,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165405273s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.2( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695512772s) [5,3,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165527344s@ mbc={}] start_peering_interval up [1,5,4] -> [5,3,4], acting [1,5,4] -> [5,3,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 
-> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.b( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524798393s) [0,5,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.397705078s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.2( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695485115s) [5,3,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165527344s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.c( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524282455s) [4,3,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397338867s@ mbc={}] start_peering_interval up [2,0,3] -> [4,3,5], acting [2,0,3] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.d( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.542307854s) [2,4,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415527344s@ mbc={}] start_peering_interval up [3,5,0] -> [2,4,1], acting [3,5,0] -> [2,4,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.5( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695068359s) [5,0,1] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165039062s@ mbc={}] start_peering_interval up [1,5,4] -> [5,0,1], acting [1,5,4] -> [5,0,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 
03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.5( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695000648s) [5,0,1] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165039062s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.f( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695568085s) [3,4,5] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165771484s@ mbc={}] start_peering_interval up [1,5,4] -> [3,4,5], acting [1,5,4] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.11( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695774078s) [3,4,2] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.166015625s@ mbc={}] start_peering_interval up [1,5,4] -> [3,4,2], acting [1,5,4] -> [3,4,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.e( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695361137s) [5,1,0] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165649414s@ mbc={}] start_peering_interval up [1,5,4] -> [5,1,0], acting [1,5,4] -> [5,1,0], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.f( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695537567s) [3,4,5] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165771484s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.11( empty 
local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695745468s) [3,4,2] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.166015625s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.e( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695332527s) [5,1,0] r=-1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165649414s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.13( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695370674s) [3,4,2] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165771484s@ mbc={}] start_peering_interval up [1,5,4] -> [3,4,2], acting [1,5,4] -> [3,4,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.16( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.700817108s) [3,5,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.171142578s@ mbc={}] start_peering_interval up [1,5,4] -> [3,5,4], acting [1,5,4] -> [3,5,4], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.15( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.700806618s) [2,4,1] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.171264648s@ mbc={}] start_peering_interval up [1,5,4] -> [2,4,1], acting [1,5,4] -> [2,4,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.9( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 
sis=40 pruub=10.711924553s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.585083008s@ mbc={}] start_peering_interval up [5,0,3] -> [2,4,3], acting [5,0,3] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.15( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.700777054s) [2,4,1] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.171264648s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.16( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.700757027s) [3,5,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.171142578s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.13( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.695317268s) [3,4,2] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165771484s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.c( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524256706s) [4,3,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397338867s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.14( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.9( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.711890221s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY 
pruub 1133.585083008s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.d( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.542258263s) [2,4,1] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415527344s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.d( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524420738s) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397705078s@ mbc={}] start_peering_interval up [2,0,3] -> [0,5,3], acting [2,0,3] -> [0,5,3], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.a( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524521828s) [5,1,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397338867s@ mbc={}] start_peering_interval up [0,2,1] -> [5,1,0], acting [0,2,1] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.b( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.542621613s) [3,2,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.416015625s@ mbc={}] start_peering_interval up [3,5,0] -> [3,2,4], acting [3,5,0] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.d( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.524420738s) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.397705078s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 
localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.1c( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,2,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.a( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.523998260s) [5,1,0] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397338867s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.b( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.542595863s) [3,2,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.416015625s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.711745262s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.585205078s@ mbc={}] start_peering_interval up [5,0,3] -> [2,4,3], acting [5,0,3] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.e( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.523836136s) [1,4,2] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397338867s@ mbc={}] start_peering_interval up [2,0,3] -> [1,4,2], acting [2,0,3] -> [1,4,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.f( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.522586823s) [2,4,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.395996094s@ mbc={}] 
start_peering_interval up [0,2,1] -> [2,4,1], acting [0,2,1] -> [2,4,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.a( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546360016s) [3,0,2] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.419799805s@ mbc={}] start_peering_interval up [0,1,2] -> [3,0,2], acting [0,1,2] -> [3,0,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.e( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.523815155s) [1,4,2] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397338867s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.15( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,3,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.a( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.544461250s) [2,1,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417602539s@ mbc={}] start_peering_interval up [3,5,0] -> [2,1,4], acting [3,5,0] -> [2,1,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.f( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.522561073s) [2,4,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.395996094s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.a( empty 
local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.544060707s) [2,1,4] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.417602539s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.a( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.546306610s) [3,0,2] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.419799805s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.711597443s) [2,4,3] r=-1 lpr=40 pi=[38,40)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.585205078s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.8( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543330193s) [4,2,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417480469s@ mbc={}] start_peering_interval up [3,5,0] -> [4,2,3], acting [3,5,0] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.8( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543302536s) [4,2,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.417480469s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.f( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.522981644s) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397705078s@ mbc={}] start_peering_interval up [2,0,3] -> [0,2,3], acting [2,0,3] -> [0,2,3], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 0, features acting 4540138322906710015 upacting 
4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.f( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.522981644s) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.397705078s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.9( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.542092323s) [0,1,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.417480469s@ mbc={}] start_peering_interval up [3,5,0] -> [0,1,2], acting [3,5,0] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.9( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.542092323s) [0,1,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1131.417480469s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.10( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.521348953s) [4,1,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397460938s@ mbc={}] start_peering_interval up [2,0,3] -> [4,1,5], acting [2,0,3] -> [4,1,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.10( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.521290779s) [4,1,5] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397460938s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.16( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.539278030s) [3,2,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active 
pruub 1131.415771484s@ mbc={}] start_peering_interval up [3,5,0] -> [3,2,0], acting [3,5,0] -> [3,2,0], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.17( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545277596s) [3,0,5] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.421752930s@ mbc={}] start_peering_interval up [0,1,2] -> [3,0,5], acting [0,1,2] -> [3,0,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.b( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.693769455s) [3,2,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active pruub 1129.165405273s@ mbc={}] start_peering_interval up [1,5,4] -> [3,2,4], acting [1,5,4] -> [3,2,4], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.16( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.539220810s) [3,2,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415771484s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[6.b( empty local-lis/les=38/39 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40 pruub=10.693547249s) [3,2,4] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1129.165405273s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.1c( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,1,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 
40 pg[2.11( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.520702362s) [5,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397338867s@ mbc={}] start_peering_interval up [2,0,3] -> [5,1,4], acting [2,0,3] -> [5,1,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.17( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545179367s) [3,0,5] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.421752930s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.11( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.520649910s) [5,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397338867s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.11( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519301414s) [4,5,3] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.395996094s@ mbc={}] start_peering_interval up [0,2,1] -> [4,5,3], acting [0,2,1] -> [4,5,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.11( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519265175s) [4,5,3] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.395996094s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.13( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519315720s) [2,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396118164s@ mbc={}] start_peering_interval up 
[0,2,1] -> [2,1,4], acting [0,2,1] -> [2,1,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.13( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519286156s) [2,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396118164s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.14( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.538310051s) [4,1,5] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415161133s@ mbc={}] start_peering_interval up [3,5,0] -> [4,1,5], acting [3,5,0] -> [4,1,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.12( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.521095276s) [4,3,2] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397949219s@ mbc={}] start_peering_interval up [2,0,3] -> [4,3,2], acting [2,0,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.14( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.538284302s) [4,1,5] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415161133s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.17( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.537755966s) [3,2,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.414794922s@ mbc={}] start_peering_interval up [3,5,0] -> [3,2,0], acting [3,5,0] -> [3,2,0], acting_primary 3 -> 3, 
up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.12( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.521046638s) [4,3,2] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397949219s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.17( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.537670135s) [3,2,0] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.414794922s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.18( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.2( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.15( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.537803650s) [4,3,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.414916992s@ mbc={}] start_peering_interval up [3,5,0] -> [4,3,2], acting [3,5,0] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.15( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.537695885s) [4,3,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.414916992s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.14( empty 
local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519906998s) [2,4,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397216797s@ mbc={}] start_peering_interval up [2,0,3] -> [2,4,1], acting [2,0,3] -> [2,4,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.14( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545886040s) [3,4,5] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.423217773s@ mbc={}] start_peering_interval up [0,1,2] -> [3,4,5], acting [0,1,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.1( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,2,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.14( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519864082s) [2,4,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397216797s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.13( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519850731s) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397216797s@ mbc={}] start_peering_interval up [2,0,3] -> [0,5,3], acting [2,0,3] -> [0,5,3], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.12( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.537525177s) [1,4,2] r=-1 lpr=40 pi=[36,40)/1 
crt=0'0 mlcod 0'0 active pruub 1131.414916992s@ mbc={}] start_peering_interval up [3,5,0] -> [1,4,2], acting [3,5,0] -> [1,4,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.13( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543646812s) [4,3,5] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.421142578s@ mbc={}] start_peering_interval up [0,1,2] -> [4,3,5], acting [0,1,2] -> [4,3,5], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.13( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543621063s) [4,3,5] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.421142578s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.13( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519850731s) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.397216797s@ mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.14( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.545675278s) [3,4,5] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.423217773s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.14( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.518434525s) [2,4,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.395996094s@ mbc={}] start_peering_interval up [0,2,1] -> [2,4,1], acting [0,2,1] -> [2,4,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 
4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.15( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519551277s) [4,1,2] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.397094727s@ mbc={}] start_peering_interval up [2,0,3] -> [4,1,2], acting [2,0,3] -> [4,1,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.c( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,3,5] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.15( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519525528s) [4,1,2] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.397094727s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.14( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.518398285s) [2,4,1] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.395996094s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.16( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519194603s) [5,0,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396972656s@ mbc={}] start_peering_interval up [2,0,3] -> [5,0,1], acting [2,0,3] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.13( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.538315773s) [2,0,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.416015625s@ mbc={}] 
start_peering_interval up [3,5,0] -> [2,0,1], acting [3,5,0] -> [2,0,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.16( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.519171715s) [5,0,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396972656s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.10( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.537351608s) [3,4,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.415039062s@ mbc={}] start_peering_interval up [3,5,0] -> [3,4,2], acting [3,5,0] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.8( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,2,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.12( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543825150s) [4,5,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.421630859s@ mbc={}] start_peering_interval up [0,1,2] -> [4,5,3], acting [0,1,2] -> [4,5,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.10( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.537322044s) [3,4,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.415039062s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.13( empty 
local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.538264275s) [2,0,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.416015625s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[4.12( empty local-lis/les=36/37 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.537111282s) [1,4,2] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.414916992s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.16( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.518940926s) [2,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396850586s@ mbc={}] start_peering_interval up [0,2,1] -> [2,1,4], acting [0,2,1] -> [2,1,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[3.16( empty local-lis/les=34/35 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.518908501s) [2,1,4] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396850586s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.17( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.518776894s) [5,4,3] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active pruub 1137.396850586s@ mbc={}] start_peering_interval up [2,0,3] -> [5,4,3], acting [2,0,3] -> [5,4,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.10( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.542869568s) [2,4,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.421020508s@ mbc={}] start_peering_interval up [0,1,2] -> [2,4,3], 
acting [0,1,2] -> [2,4,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.15( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,1,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.10( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.542813301s) [2,4,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.421020508s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.11( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543355942s) [2,4,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active pruub 1131.421264648s@ mbc={}] start_peering_interval up [0,1,2] -> [2,4,3], acting [0,1,2] -> [2,4,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.11( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543018341s) [2,4,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.421264648s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[2.17( empty local-lis/les=34/35 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40 pruub=14.518570900s) [5,4,3] r=-1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.396850586s@ mbc={}] state: transitioning to Stray Nov 26 03:04:48 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[5.12( empty local-lis/les=36/37 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40 pruub=8.543621063s) [4,5,3] r=-1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1131.421630859s@ mbc={}] state: transitioning 
to Stray Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.12( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,3,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.10( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,1,5] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:04:48 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 5.0 scrub starts Nov 26 03:04:48 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 5.0 scrub ok Nov 26 03:04:49 localhost python3[55525]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.1f( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [1,5,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.c( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [1,2,4] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.9( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [1,5,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.f( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [1,4,5] r=1 lpr=40 
pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.1e( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [3,4,5] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.1( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [1,2,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.6( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [1,2,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.10( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [1,0,2] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.2( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [3,4,5] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.14( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [1,5,0] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.e( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [1,4,2] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.12( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [1,4,2] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 
03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.b( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [3,5,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.1f( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [3,0,5] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.1e( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [1,2,4] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.14( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [3,4,5] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.10( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [3,4,2] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.10( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [2,4,3] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.1f( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [2,4,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.b( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [3,2,4] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.1a( empty 
local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [5,0,1] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.1( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [3,4,5] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.16( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [2,1,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.11( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [2,4,3] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.1f( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [3,2,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.1e( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [3,4,5] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.e( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [5,1,0] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.14( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [2,4,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.5( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [5,0,1] r=1 
lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.13( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [2,1,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.f( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [2,4,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.3( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [2,1,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.1c( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [2,4,3] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.1b( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [2,3,4] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.5( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [5,3,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.1a( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [2,4,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[5.7( empty local-lis/les=0/0 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [5,1,4] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray 
Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.8( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,0,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[3.1c( empty local-lis/les=0/0 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [5,1,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[7.d( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 40 pg[6.d( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,1,0] r=2 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.e( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [2,4,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[7.3( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.17( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [5,4,3] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[7.1( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[7.7( empty 
local-lis/les=0/0 n=0 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=40/41 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[6.12( empty local-lis/les=40/41 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [0,2,1] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[7.5( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.11( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [5,1,4] r=2 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[3.11( empty local-lis/les=40/41 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[4.14( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[5.12( empty local-lis/les=40/41 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,5,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost 
ceph-osd[32631]: osd.4 pg_epoch: 40 pg[7.f( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[5.b( empty local-lis/les=40/41 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[5.13( empty local-lis/les=40/41 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,3,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[5.8( empty local-lis/les=40/41 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.d( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [2,4,1] r=1 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[7.9( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[4.a( empty local-lis/les=0/0 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [2,1,4] r=2 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[3.8( empty local-lis/les=40/41 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,1,5] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 
03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[2.10( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,1,5] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[5.1a( empty local-lis/les=40/41 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[5.d( empty local-lis/les=40/41 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,5,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[2.14( empty local-lis/les=0/0 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [2,4,1] r=1 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[3.e( empty local-lis/les=40/41 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 40 pg[7.b( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=40) [2,4,3] r=1 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=40/41 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,5,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active 
mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[6.1( empty local-lis/les=40/41 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [4,5,3] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[4.2( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[2.c( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,3,5] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[2.18( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[2.15( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,1,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[6.17( empty local-lis/les=40/41 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [4,1,2] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[2.12( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,3,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[4.8( empty 
local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,2,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[4.1( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,2,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[5.e( empty local-lis/les=40/41 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,3,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[2.1c( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,2,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[4.1c( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,1,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[3.1a( empty local-lis/les=40/41 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,1,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[3.1d( empty local-lis/les=40/41 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,2,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[4.15( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [4,3,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react 
AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[6.1e( empty local-lis/les=40/41 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [4,5,3] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[6.1c( empty local-lis/les=40/41 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [4,3,5] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=40/41 n=0 ec=36/22 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,1,5] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[32631]: osd.4 pg_epoch: 41 pg[3.9( empty local-lis/les=40/41 n=0 ec=34/19 lis/c=34/34 les/c/f=35/35/0 sis=40) [4,2,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[6.1b( empty local-lis/les=40/41 n=0 ec=38/28 lis/c=38/38 les/c/f=39/39/0 sis=40) [0,2,3] r=0 lpr=40 pi=[38,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[4.19( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,3,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[2.1b( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[2.a( empty local-lis/les=40/41 n=0 
ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,3,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[4.6( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,1,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,2,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,1,2] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[2.5( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,3,2] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[4.1f( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[4.1d( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,5,3] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,5,1] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated 
Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[2.d( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[4.3( empty local-lis/les=40/41 n=0 ec=36/20 lis/c=36/36 les/c/f=37/37/0 sis=40) [0,5,1] r=0 lpr=40 pi=[36,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost ceph-osd[31674]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=40/41 n=0 ec=34/18 lis/c=34/34 les/c/f=35/35/0 sis=40) [0,5,3] r=0 lpr=40 pi=[34,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:04:49 localhost podman[55526]: 2025-11-26 08:04:49.817990473 +0000 UTC m=+0.079286271 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com) Nov 26 03:04:50 localhost podman[55526]: 2025-11-26 08:04:50.006236653 +0000 UTC m=+0.267532421 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, name=rhosp17/openstack-qdrouterd) Nov 26 03:04:50 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:04:50 localhost ceph-osd[31674]: osd.0 pg_epoch: 42 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=42 pruub=8.502484322s) [3,0,5] r=1 lpr=42 pi=[38,42)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.584594727s@ mbc={}] start_peering_interval up [5,0,3] -> [3,0,5], acting [5,0,3] -> [3,0,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:50 localhost ceph-osd[31674]: osd.0 pg_epoch: 42 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=42 pruub=8.502388954s) [3,0,5] r=1 lpr=42 pi=[38,42)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.584594727s@ mbc={}] state: transitioning to Stray Nov 26 03:04:50 localhost ceph-osd[31674]: osd.0 pg_epoch: 42 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=42 pruub=8.502665520s) [3,0,5] r=1 lpr=42 pi=[38,42)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.584838867s@ mbc={}] start_peering_interval up [5,0,3] -> [3,0,5], acting [5,0,3] -> [3,0,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:50 localhost ceph-osd[31674]: osd.0 pg_epoch: 42 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=42 pruub=8.502598763s) [3,0,5] r=1 lpr=42 pi=[38,42)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.584838867s@ mbc={}] state: transitioning to Stray Nov 26 03:04:50 localhost ceph-osd[31674]: osd.0 pg_epoch: 42 pg[7.2( v 31'39 (0'0,31'39] 
local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=42 pruub=8.506554604s) [3,0,5] r=1 lpr=42 pi=[38,42)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.588989258s@ mbc={}] start_peering_interval up [5,0,3] -> [3,0,5], acting [5,0,3] -> [3,0,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:50 localhost ceph-osd[31674]: osd.0 pg_epoch: 42 pg[7.2( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=42 pruub=8.506464958s) [3,0,5] r=1 lpr=42 pi=[38,42)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.588989258s@ mbc={}] state: transitioning to Stray Nov 26 03:04:50 localhost ceph-osd[31674]: osd.0 pg_epoch: 42 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=42 pruub=8.503144264s) [3,0,5] r=1 lpr=42 pi=[38,42)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1133.585205078s@ mbc={}] start_peering_interval up [5,0,3] -> [3,0,5], acting [5,0,3] -> [3,0,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:50 localhost ceph-osd[31674]: osd.0 pg_epoch: 42 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=42 pruub=8.502145767s) [3,0,5] r=1 lpr=42 pi=[38,42)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1133.585205078s@ mbc={}] state: transitioning to Stray Nov 26 03:04:51 localhost python3[55571]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:04:52 localhost sshd[55572]: main: sshd: ssh-rsa 
algorithm is disabled Nov 26 03:04:52 localhost ceph-osd[32631]: osd.4 pg_epoch: 44 pg[7.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/38 les/c/f=41/39/0 sis=44 pruub=12.962328911s) [3,2,4] r=2 lpr=44 pi=[38,44)/2 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1135.689453125s@ m=3 mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:52 localhost ceph-osd[32631]: osd.4 pg_epoch: 44 pg[7.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/38 les/c/f=41/39/0 sis=44 pruub=12.962266922s) [3,2,4] r=2 lpr=44 pi=[38,44)/2 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1135.689453125s@ m=3 mbc={}] state: transitioning to Stray Nov 26 03:04:52 localhost ceph-osd[32631]: osd.4 pg_epoch: 44 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/40 les/c/f=41/42/0 sis=44 pruub=12.959233284s) [3,2,4] r=2 lpr=44 pi=[40,44)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1135.687011719s@ mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:52 localhost ceph-osd[32631]: osd.4 pg_epoch: 44 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/40 les/c/f=41/42/0 sis=44 pruub=12.959167480s) [3,2,4] r=2 lpr=44 pi=[40,44)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1135.687011719s@ mbc={}] state: transitioning to Stray Nov 26 03:04:52 localhost ceph-osd[32631]: osd.4 pg_epoch: 44 pg[7.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/38 les/c/f=41/39/0 sis=44 pruub=12.961388588s) [3,2,4] r=2 lpr=44 pi=[38,44)/2 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1135.689208984s@ m=1 mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 
-> 3, up_primary 2 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:52 localhost ceph-osd[32631]: osd.4 pg_epoch: 44 pg[7.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=40/41 n=2 ec=38/29 lis/c=40/38 les/c/f=41/39/0 sis=44 pruub=12.959120750s) [3,2,4] r=2 lpr=44 pi=[38,44)/2 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1135.687011719s@ m=2 mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:52 localhost ceph-osd[32631]: osd.4 pg_epoch: 44 pg[7.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/38 les/c/f=41/39/0 sis=44 pruub=12.961311340s) [3,2,4] r=2 lpr=44 pi=[38,44)/2 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1135.689208984s@ m=1 mbc={}] state: transitioning to Stray Nov 26 03:04:52 localhost ceph-osd[32631]: osd.4 pg_epoch: 44 pg[7.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=40/41 n=2 ec=38/29 lis/c=40/38 les/c/f=41/39/0 sis=44 pruub=12.959087372s) [3,2,4] r=2 lpr=44 pi=[38,44)/2 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1135.687011719s@ m=2 mbc={}] state: transitioning to Stray Nov 26 03:04:52 localhost ceph-osd[31674]: osd.0 pg_epoch: 44 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=-1 lpr=44 pi=[38,44)/2 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:52 localhost ceph-osd[31674]: osd.0 pg_epoch: 44 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=-1 lpr=44 pi=[38,44)/2 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:52 localhost ceph-osd[31674]: osd.0 pg_epoch: 44 
pg[7.b( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=-1 lpr=44 pi=[38,44)/2 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:52 localhost ceph-osd[31674]: osd.0 pg_epoch: 44 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=-1 lpr=44 pi=[38,44)/2 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:52 localhost ceph-osd[31674]: osd.0 pg_epoch: 44 pg[7.3( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=-1 lpr=44 pi=[38,44)/2 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:52 localhost ceph-osd[31674]: osd.0 pg_epoch: 44 pg[7.7( v 31'39 (0'0,31'39] lb MIN local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=-1 lpr=44 pi=[38,44)/2 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:52 localhost ceph-osd[31674]: osd.0 pg_epoch: 44 pg[7.7( v 31'39 (0'0,31'39] lb MIN local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=-1 lpr=44 pi=[38,44)/2 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:52 localhost ceph-osd[31674]: osd.0 pg_epoch: 44 pg[7.3( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=-1 lpr=44 
pi=[38,44)/2 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 26 03:04:52 localhost python3[55589]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:04:55 localhost python3[55637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:04:56 localhost python3[55680]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144295.3896842-92246-74160465251864/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=fb7204a16245207b5739f6a2b62bcbdeec90bcc9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:04:56 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 5.8 scrub starts Nov 26 03:04:57 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 5.8 scrub ok Nov 26 03:04:57 localhost ceph-osd[31674]: osd.0 pg_epoch: 46 pg[7.c( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=9.250824928s) [1,5,0] r=2 lpr=46 pi=[38,46)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1141.584472656s@ mbc={}] start_peering_interval up [5,0,3] -> [1,5,0], acting [5,0,3] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> 2, features acting 
4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:57 localhost ceph-osd[31674]: osd.0 pg_epoch: 46 pg[7.4( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=9.250993729s) [1,5,0] r=2 lpr=46 pi=[38,46)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1141.584594727s@ mbc={}] start_peering_interval up [5,0,3] -> [1,5,0], acting [5,0,3] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:57 localhost ceph-osd[31674]: osd.0 pg_epoch: 46 pg[7.c( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=9.250760078s) [1,5,0] r=2 lpr=46 pi=[38,46)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1141.584472656s@ mbc={}] state: transitioning to Stray Nov 26 03:04:57 localhost ceph-osd[31674]: osd.0 pg_epoch: 46 pg[7.4( v 31'39 (0'0,31'39] local-lis/les=38/39 n=2 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=9.250929832s) [1,5,0] r=2 lpr=46 pi=[38,46)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1141.584594727s@ mbc={}] state: transitioning to Stray Nov 26 03:04:58 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.a scrub starts Nov 26 03:04:58 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.a scrub ok Nov 26 03:04:58 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 6.1 scrub starts Nov 26 03:04:58 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 6.1 scrub ok Nov 26 03:04:59 localhost ceph-osd[32631]: osd.4 pg_epoch: 48 pg[7.5( v 31'39 (0'0,31'39] local-lis/les=40/41 n=2 ec=38/29 lis/c=40/40 les/c/f=41/45/0 sis=48 pruub=13.684480667s) [4,3,2] r=0 lpr=48 pi=[40,48)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1143.689697266s@ mbc={}] start_peering_interval up [2,4,3] -> [4,3,2], acting [2,4,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 
03:04:59 localhost ceph-osd[32631]: osd.4 pg_epoch: 48 pg[7.5( v 31'39 (0'0,31'39] local-lis/les=40/41 n=2 ec=38/29 lis/c=40/40 les/c/f=41/45/0 sis=48 pruub=13.684480667s) [4,3,2] r=0 lpr=48 pi=[40,48)/1 crt=31'39 mlcod 0'0 unknown pruub 1143.689697266s@ mbc={}] state: transitioning to Primary Nov 26 03:04:59 localhost ceph-osd[32631]: osd.4 pg_epoch: 48 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/40 les/c/f=41/45/0 sis=48 pruub=13.681457520s) [4,3,2] r=0 lpr=48 pi=[40,48)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1143.687377930s@ mbc={}] start_peering_interval up [2,4,3] -> [4,3,2], acting [2,4,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:04:59 localhost ceph-osd[32631]: osd.4 pg_epoch: 48 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/40 les/c/f=41/45/0 sis=48 pruub=13.681457520s) [4,3,2] r=0 lpr=48 pi=[40,48)/1 crt=31'39 mlcod 0'0 unknown pruub 1143.687377930s@ mbc={}] state: transitioning to Primary Nov 26 03:05:00 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.18 deep-scrub starts Nov 26 03:05:00 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.18 deep-scrub ok Nov 26 03:05:00 localhost python3[55742]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:00 localhost ceph-osd[32631]: osd.4 pg_epoch: 49 pg[7.5( v 31'39 (0'0,31'39] local-lis/les=48/49 n=2 ec=38/29 lis/c=40/40 les/c/f=41/45/0 sis=48) [4,3,2] r=0 lpr=48 pi=[40,48)/1 crt=31'39 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:05:00 localhost ceph-osd[32631]: osd.4 pg_epoch: 49 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=38/29 lis/c=40/40 les/c/f=41/45/0 sis=48) [4,3,2] r=0 lpr=48 pi=[40,48)/1 crt=31'39 mlcod 0'0 
active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:05:00 localhost python3[55785]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144300.2654455-92246-175567425139114/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=ed15ddf982ee6bf185ab5c82af7275dcc16a80b1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:02 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.5 scrub starts Nov 26 03:05:02 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.5 scrub ok Nov 26 03:05:04 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.1f scrub starts Nov 26 03:05:04 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.1f scrub ok Nov 26 03:05:05 localhost python3[55847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:06 localhost python3[55890]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144305.3550365-92246-113596737874004/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=516b2b29193ca54dc570087b840f3446a8725769 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:07 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.12 deep-scrub starts Nov 26 03:05:07 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.12 deep-scrub 
ok Nov 26 03:05:07 localhost ceph-osd[31674]: osd.0 pg_epoch: 50 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=42/43 n=2 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.597479820s) [3,2,4] r=-1 lpr=50 pi=[42,50)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1158.129394531s@ mbc={}] start_peering_interval up [3,0,5] -> [3,2,4], acting [3,0,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:07 localhost ceph-osd[31674]: osd.0 pg_epoch: 50 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=42/43 n=1 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.597455025s) [3,2,4] r=-1 lpr=50 pi=[42,50)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1158.129394531s@ mbc={}] start_peering_interval up [3,0,5] -> [3,2,4], acting [3,0,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:07 localhost ceph-osd[31674]: osd.0 pg_epoch: 50 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=42/43 n=1 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.597414970s) [3,2,4] r=-1 lpr=50 pi=[42,50)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1158.129394531s@ mbc={}] state: transitioning to Stray Nov 26 03:05:07 localhost ceph-osd[31674]: osd.0 pg_epoch: 50 pg[7.6( v 31'39 (0'0,31'39] local-lis/les=42/43 n=2 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.597375870s) [3,2,4] r=-1 lpr=50 pi=[42,50)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1158.129394531s@ mbc={}] state: transitioning to Stray Nov 26 03:05:08 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 5.b deep-scrub starts Nov 26 03:05:08 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 5.b deep-scrub ok Nov 26 03:05:08 localhost ceph-osd[32631]: osd.4 pg_epoch: 50 pg[7.e( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=50) [3,2,4] r=2 lpr=50 pi=[42,50)/1 crt=0'0 mlcod 0'0 unknown 
mbc={}] state: transitioning to Stray Nov 26 03:05:08 localhost ceph-osd[32631]: osd.4 pg_epoch: 50 pg[7.6( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=50) [3,2,4] r=2 lpr=50 pi=[42,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:05:09 localhost ceph-osd[32631]: osd.4 pg_epoch: 52 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=44/45 n=1 ec=38/29 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=15.595621109s) [2,0,3] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1155.764038086s@ mbc={}] start_peering_interval up [3,2,4] -> [2,0,3], acting [3,2,4] -> [2,0,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:09 localhost ceph-osd[32631]: osd.4 pg_epoch: 52 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=44/45 n=1 ec=38/29 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=15.599453926s) [2,0,3] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1155.767944336s@ mbc={}] start_peering_interval up [3,2,4] -> [2,0,3], acting [3,2,4] -> [2,0,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:09 localhost ceph-osd[32631]: osd.4 pg_epoch: 52 pg[7.7( v 31'39 (0'0,31'39] local-lis/les=44/45 n=1 ec=38/29 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=15.599343300s) [2,0,3] r=-1 lpr=52 pi=[44,52)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1155.767944336s@ mbc={}] state: transitioning to Stray Nov 26 03:05:09 localhost ceph-osd[32631]: osd.4 pg_epoch: 52 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=44/45 n=1 ec=38/29 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=15.595314980s) [2,0,3] r=-1 lpr=52 pi=[44,52)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1155.764038086s@ mbc={}] state: transitioning to Stray Nov 26 03:05:10 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 5.1a scrub starts Nov 26 03:05:10 localhost ceph-osd[31674]: log_channel(cluster) 
log [DBG] : 5.1a scrub ok Nov 26 03:05:10 localhost python3[55952]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:11 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.1b scrub starts Nov 26 03:05:11 localhost ceph-osd[31674]: osd.0 pg_epoch: 52 pg[7.f( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=44/44 les/c/f=45/45/0 sis=52) [2,0,3] r=1 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:05:11 localhost ceph-osd[31674]: osd.0 pg_epoch: 52 pg[7.7( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=44/44 les/c/f=45/45/0 sis=52) [2,0,3] r=1 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:05:11 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.1b scrub ok Nov 26 03:05:11 localhost python3[55997]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144310.5572765-92760-146121445471006/source _original_basename=tmph4lme6fl follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:12 localhost ceph-osd[31674]: osd.0 pg_epoch: 54 pg[7.8( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=54 pruub=10.828530312s) [3,2,0] r=2 lpr=54 pi=[38,54)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1157.585449219s@ mbc={}] start_peering_interval up [5,0,3] -> [3,2,0], acting [5,0,3] -> [3,2,0], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:12 localhost ceph-osd[31674]: 
osd.0 pg_epoch: 54 pg[7.8( v 31'39 (0'0,31'39] local-lis/les=38/39 n=1 ec=38/29 lis/c=38/38 les/c/f=39/39/0 sis=54 pruub=10.828450203s) [3,2,0] r=2 lpr=54 pi=[38,54)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1157.585449219s@ mbc={}] state: transitioning to Stray Nov 26 03:05:12 localhost python3[56059]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:12 localhost python3[56102]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144312.1204607-92848-4992147593131/source _original_basename=tmp53igsydc follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:13 localhost python3[56132]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None Nov 26 03:05:13 localhost python3[56150]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:05:14 localhost ceph-osd[32631]: osd.4 pg_epoch: 56 pg[7.9( v 31'39 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/40 les/c/f=41/41/0 sis=56 pruub=15.311073303s) [1,0,2] r=-1 lpr=56 pi=[40,56)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1159.689819336s@ mbc={}] start_peering_interval up [2,4,3] -> [1,0,2], acting [2,4,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, 
features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:14 localhost ceph-osd[32631]: osd.4 pg_epoch: 56 pg[7.9( v 31'39 (0'0,31'39] local-lis/les=40/41 n=1 ec=38/29 lis/c=40/40 les/c/f=41/41/0 sis=56 pruub=15.311008453s) [1,0,2] r=-1 lpr=56 pi=[40,56)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1159.689819336s@ mbc={}] state: transitioning to Stray Nov 26 03:05:15 localhost ceph-osd[31674]: osd.0 pg_epoch: 56 pg[7.9( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=40/40 les/c/f=41/41/0 sis=56) [1,0,2] r=1 lpr=56 pi=[40,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:05:15 localhost ansible-async_wrapper.py[56322]: Invoked with 593342735899 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144314.6965084-93009-76407543070331/AnsiballZ_command.py _ Nov 26 03:05:15 localhost ansible-async_wrapper.py[56325]: Starting module and watcher Nov 26 03:05:15 localhost ansible-async_wrapper.py[56325]: Start watching 56326 (3600) Nov 26 03:05:15 localhost ansible-async_wrapper.py[56326]: Start module (56326) Nov 26 03:05:15 localhost ansible-async_wrapper.py[56322]: Return async_wrapper task started. 
Nov 26 03:05:15 localhost python3[56343]: ansible-ansible.legacy.async_status Invoked with jid=593342735899.56322 mode=status _async_dir=/tmp/.ansible_async Nov 26 03:05:16 localhost ceph-osd[32631]: osd.4 pg_epoch: 58 pg[7.a( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=58) [4,3,5] r=0 lpr=58 pi=[42,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 26 03:05:16 localhost ceph-osd[31674]: osd.0 pg_epoch: 58 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=42/43 n=1 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=58 pruub=15.288205147s) [4,3,5] r=-1 lpr=58 pi=[42,58)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1166.132690430s@ mbc={}] start_peering_interval up [3,0,5] -> [4,3,5], acting [3,0,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:16 localhost ceph-osd[31674]: osd.0 pg_epoch: 58 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=42/43 n=1 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=58 pruub=15.288147926s) [4,3,5] r=-1 lpr=58 pi=[42,58)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1166.132690430s@ mbc={}] state: transitioning to Stray Nov 26 03:05:16 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 3.9 scrub starts Nov 26 03:05:16 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 3.9 scrub ok Nov 26 03:05:17 localhost ceph-osd[32631]: osd.4 pg_epoch: 59 pg[7.a( v 31'39 (0'0,31'39] local-lis/les=58/59 n=1 ec=38/29 lis/c=42/42 les/c/f=43/43/0 sis=58) [4,3,5] r=0 lpr=58 pi=[42,58)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 26 03:05:17 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 3.1d scrub starts Nov 26 03:05:17 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 3.1d scrub ok Nov 26 03:05:18 localhost ceph-osd[32631]: osd.4 pg_epoch: 60 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=44/45 n=1 ec=38/29 lis/c=44/44 
les/c/f=45/45/0 sis=60 pruub=15.282883644s) [1,2,4] r=2 lpr=60 pi=[44,60)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1163.768798828s@ mbc={}] start_peering_interval up [3,2,4] -> [1,2,4], acting [3,2,4] -> [1,2,4], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:18 localhost ceph-osd[32631]: osd.4 pg_epoch: 60 pg[7.b( v 31'39 (0'0,31'39] local-lis/les=44/45 n=1 ec=38/29 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=15.282533646s) [1,2,4] r=2 lpr=60 pi=[44,60)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1163.768798828s@ mbc={}] state: transitioning to Stray Nov 26 03:05:18 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 5.e scrub starts Nov 26 03:05:18 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 5.e scrub ok Nov 26 03:05:18 localhost puppet-user[56346]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 26 03:05:18 localhost puppet-user[56346]: (file: /etc/puppet/hiera.yaml) Nov 26 03:05:18 localhost puppet-user[56346]: Warning: Undefined variable '::deploy_config_name'; Nov 26 03:05:18 localhost puppet-user[56346]: (file & line not available) Nov 26 03:05:18 localhost puppet-user[56346]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 03:05:18 localhost puppet-user[56346]: (file & line not available) Nov 26 03:05:18 localhost puppet-user[56346]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 26 03:05:19 localhost puppet-user[56346]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 26 03:05:19 localhost puppet-user[56346]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.11 seconds Nov 26 03:05:19 localhost puppet-user[56346]: Notice: Applied catalog in 0.04 seconds Nov 26 03:05:19 localhost puppet-user[56346]: Application: Nov 26 03:05:19 localhost puppet-user[56346]: Initial environment: production Nov 26 03:05:19 localhost puppet-user[56346]: Converged environment: production Nov 26 03:05:19 localhost puppet-user[56346]: Run mode: user Nov 26 03:05:19 localhost puppet-user[56346]: Changes: Nov 26 03:05:19 localhost puppet-user[56346]: Events: Nov 26 03:05:19 localhost puppet-user[56346]: Resources: Nov 26 03:05:19 localhost puppet-user[56346]: Total: 10 Nov 26 03:05:19 localhost puppet-user[56346]: Time: Nov 26 03:05:19 localhost puppet-user[56346]: Schedule: 0.00 Nov 26 03:05:19 localhost puppet-user[56346]: File: 0.00 Nov 26 03:05:19 localhost puppet-user[56346]: Augeas: 0.01 Nov 26 03:05:19 localhost puppet-user[56346]: Exec: 0.01 Nov 26 03:05:19 localhost puppet-user[56346]: Transaction evaluation: 0.04 Nov 26 03:05:19 localhost puppet-user[56346]: Catalog application: 0.04 Nov 26 03:05:19 localhost puppet-user[56346]: Config retrieval: 0.14 Nov 26 03:05:19 localhost puppet-user[56346]: Last run: 1764144319 Nov 26 03:05:19 localhost puppet-user[56346]: Filebucket: 0.00 Nov 26 03:05:19 localhost puppet-user[56346]: Total: 0.05 Nov 26 03:05:19 localhost puppet-user[56346]: Version: Nov 26 03:05:19 localhost puppet-user[56346]: Config: 1764144318 Nov 26 03:05:19 localhost puppet-user[56346]: Puppet: 7.10.0 Nov 26 03:05:19 localhost ansible-async_wrapper.py[56326]: Module complete (56326) Nov 26 03:05:19 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 6.1e deep-scrub starts Nov 26 03:05:19 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 6.1e deep-scrub ok Nov 26 03:05:20 localhost ceph-osd[31674]: 
log_channel(cluster) log [DBG] : 5.4 scrub starts Nov 26 03:05:20 localhost ceph-osd[31674]: osd.0 pg_epoch: 62 pg[7.c( v 31'39 (0'0,31'39] local-lis/les=46/47 n=1 ec=38/29 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=10.446937561s) [2,3,4] r=-1 lpr=62 pi=[46,62)/1 luod=0'0 crt=31'39 lcod 0'0 mlcod 0'0 active pruub 1165.383544922s@ mbc={}] start_peering_interval up [1,5,0] -> [2,3,4], acting [1,5,0] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:20 localhost ceph-osd[31674]: osd.0 pg_epoch: 62 pg[7.c( v 31'39 (0'0,31'39] local-lis/les=46/47 n=1 ec=38/29 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=10.446599960s) [2,3,4] r=-1 lpr=62 pi=[46,62)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1165.383544922s@ mbc={}] state: transitioning to Stray Nov 26 03:05:20 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 5.4 scrub ok Nov 26 03:05:20 localhost ansible-async_wrapper.py[56325]: Done in kid B. Nov 26 03:05:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:05:20 localhost podman[56457]: 2025-11-26 08:05:20.363116149 +0000 UTC m=+0.075681741 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step1, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:05:20 localhost podman[56457]: 2025-11-26 08:05:20.558369874 +0000 UTC m=+0.270935396 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., container_name=metrics_qdr, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z) Nov 26 03:05:20 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:05:21 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.f scrub starts Nov 26 03:05:21 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.f scrub ok Nov 26 03:05:21 localhost ceph-osd[32631]: osd.4 pg_epoch: 62 pg[7.c( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=46/46 les/c/f=47/47/0 sis=62) [2,3,4] r=2 lpr=62 pi=[46,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:05:22 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.3 scrub starts Nov 26 03:05:22 localhost ceph-osd[32631]: osd.4 pg_epoch: 64 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=38/29 lis/c=48/48 les/c/f=49/49/0 sis=64 pruub=10.445858955s) [2,1,0] r=-1 lpr=64 pi=[48,64)/1 crt=31'39 mlcod 0'0 active pruub 1163.028442383s@ mbc={}] start_peering_interval up [4,3,2] -> [2,1,0], acting [4,3,2] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:22 localhost ceph-osd[32631]: osd.4 pg_epoch: 64 pg[7.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=38/29 lis/c=48/48 les/c/f=49/49/0 sis=64 pruub=10.445764542s) [2,1,0] r=-1 lpr=64 pi=[48,64)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1163.028442383s@ mbc={}] state: transitioning to Stray Nov 26 03:05:22 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.3 scrub ok Nov 26 03:05:22 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 6.17 scrub starts Nov 26 03:05:22 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 6.17 scrub ok Nov 26 03:05:23 localhost ceph-osd[31674]: osd.0 pg_epoch: 64 pg[7.d( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=48/48 les/c/f=49/49/0 sis=64) [2,1,0] r=2 lpr=64 pi=[48,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:05:23 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 6.1c scrub starts Nov 26 03:05:23 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 6.1c scrub ok Nov 26 03:05:24 
localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.1b scrub starts Nov 26 03:05:24 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.1b scrub ok Nov 26 03:05:24 localhost ceph-osd[32631]: osd.4 pg_epoch: 66 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=50/51 n=1 ec=38/29 lis/c=50/50 les/c/f=51/51/0 sis=66 pruub=8.881623268s) [1,4,5] r=1 lpr=66 pi=[50,66)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1163.511474609s@ mbc={}] start_peering_interval up [3,2,4] -> [1,4,5], acting [3,2,4] -> [1,4,5], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:24 localhost ceph-osd[32631]: osd.4 pg_epoch: 66 pg[7.e( v 31'39 (0'0,31'39] local-lis/les=50/51 n=1 ec=38/29 lis/c=50/50 les/c/f=51/51/0 sis=66 pruub=8.881560326s) [1,4,5] r=1 lpr=66 pi=[50,66)/1 crt=31'39 mlcod 0'0 unknown NOTIFY pruub 1163.511474609s@ mbc={}] state: transitioning to Stray Nov 26 03:05:24 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.14 scrub starts Nov 26 03:05:24 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.14 scrub ok Nov 26 03:05:25 localhost python3[56578]: ansible-ansible.legacy.async_status Invoked with jid=593342735899.56322 mode=status _async_dir=/tmp/.ansible_async Nov 26 03:05:26 localhost ceph-osd[31674]: osd.0 pg_epoch: 68 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=38/29 lis/c=52/52 les/c/f=53/53/0 sis=68 pruub=8.805140495s) [3,4,5] r=-1 lpr=68 pi=[52,68)/1 luod=0'0 crt=31'39 mlcod 0'0 active pruub 1169.892333984s@ mbc={}] start_peering_interval up [2,0,3] -> [3,4,5], acting [2,0,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:26 localhost ceph-osd[31674]: osd.0 pg_epoch: 68 pg[7.f( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=38/29 lis/c=52/52 les/c/f=53/53/0 sis=68 pruub=8.805055618s) [3,4,5] r=-1 lpr=68 pi=[52,68)/1 crt=31'39 mlcod 0'0 unknown 
NOTIFY pruub 1169.892333984s@ mbc={}] state: transitioning to Stray Nov 26 03:05:26 localhost python3[56594]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 26 03:05:27 localhost python3[56610]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:05:27 localhost ceph-osd[32631]: osd.4 pg_epoch: 68 pg[7.f( empty local-lis/les=0/0 n=0 ec=38/29 lis/c=52/52 les/c/f=53/53/0 sis=68) [3,4,5] r=1 lpr=68 pi=[52,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:05:27 localhost python3[56660]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:27 localhost python3[56678]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp8j8sn1yo recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 26 03:05:28 localhost python3[56708]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:29 localhost python3[56812]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 26 03:05:30 localhost python3[56831]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:30 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.1d scrub starts Nov 26 03:05:30 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.1d scrub ok Nov 26 03:05:30 localhost python3[56863]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:05:31 localhost python3[56913]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:31 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.2 scrub starts Nov 26 03:05:31 
localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.2 scrub ok Nov 26 03:05:31 localhost python3[56931]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:32 localhost python3[56993]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:32 localhost python3[57011]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:33 localhost python3[57073]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:33 localhost python3[57091]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:33 localhost python3[57153]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:34 localhost python3[57171]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:34 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.19 scrub starts Nov 26 03:05:34 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.19 scrub ok Nov 26 03:05:34 localhost sshd[57202]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:05:34 localhost python3[57201]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:05:34 localhost systemd[1]: Reloading. Nov 26 03:05:34 localhost systemd-rc-local-generator[57225]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:05:34 localhost systemd-sysv-generator[57228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 03:05:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:05:35 localhost python3[57289]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:35 localhost ceph-osd[31674]: osd.0 70 crush map has features 432629239337189376, adjusting msgr requires for clients Nov 26 03:05:35 localhost ceph-osd[31674]: osd.0 70 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons Nov 26 03:05:35 localhost ceph-osd[31674]: osd.0 70 crush map has features 3314933000854323200, adjusting msgr requires for osds Nov 26 03:05:35 localhost ceph-osd[32631]: osd.4 70 crush map has features 432629239337189376, adjusting msgr requires for clients Nov 26 03:05:35 localhost ceph-osd[32631]: osd.4 70 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons Nov 26 03:05:35 localhost ceph-osd[32631]: osd.4 70 crush map has features 3314933000854323200, adjusting msgr requires for osds Nov 26 03:05:35 localhost ceph-osd[32631]: osd.4 pg_epoch: 70 pg[6.11( empty local-lis/les=40/41 n=0 ec=38/28 lis/c=40/40 les/c/f=41/41/0 sis=70 pruub=9.799412727s) [3,0,2] r=-1 lpr=70 pi=[40,70)/1 crt=0'0 mlcod 0'0 active pruub 1175.673706055s@ mbc={}] start_peering_interval up [3,4,2] -> [3,0,2], acting [3,4,2] -> [3,0,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 26 03:05:35 localhost ceph-osd[32631]: osd.4 pg_epoch: 70 pg[6.11( empty local-lis/les=40/41 n=0 ec=38/28 lis/c=40/40 les/c/f=41/41/0 sis=70 pruub=9.799309731s) [3,0,2] r=-1 lpr=70 pi=[40,70)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1175.673706055s@ mbc={}] state: transitioning to Stray 
Nov 26 03:05:35 localhost python3[57307]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:36 localhost python3[57369]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:05:36 localhost ceph-osd[31674]: osd.0 pg_epoch: 70 pg[6.11( empty local-lis/les=0/0 n=0 ec=38/28 lis/c=40/40 les/c/f=41/41/0 sis=70) [3,0,2] r=1 lpr=70 pi=[40,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 26 03:05:36 localhost python3[57387]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:36 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.1c scrub starts Nov 26 03:05:36 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.1c scrub ok Nov 26 03:05:37 localhost python3[57417]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 
03:05:37 localhost systemd[1]: Reloading. Nov 26 03:05:37 localhost systemd-sysv-generator[57444]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:05:37 localhost systemd-rc-local-generator[57441]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:05:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:05:37 localhost systemd[1]: Starting Create netns directory... Nov 26 03:05:37 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 03:05:37 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 03:05:37 localhost systemd[1]: Finished Create netns directory. Nov 26 03:05:38 localhost python3[57476]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 26 03:05:38 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.15 scrub starts Nov 26 03:05:38 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.15 scrub ok Nov 26 03:05:39 localhost python3[57534]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 26 03:05:39 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.15 scrub starts Nov 26 03:05:39 localhost podman[57607]: 2025-11-26 08:05:39.733880668 +0000 UTC m=+0.086123631 container create be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, release=1761123044, container_name=nova_compute_init_log, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:05:39 localhost systemd[1]: Started libpod-conmon-be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628.scope. 
Nov 26 03:05:39 localhost podman[57614]: 2025-11-26 08:05:39.781550659 +0000 UTC m=+0.106570157 container create 3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step2, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, container_name=nova_virtqemud_init_logs, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 26 03:05:39 localhost 
podman[57607]: 2025-11-26 08:05:39.686395292 +0000 UTC m=+0.038638255 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 26 03:05:39 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.15 scrub ok Nov 26 03:05:39 localhost systemd[1]: Started libcrun container. Nov 26 03:05:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9100a267b24367cb8370dbd929de9892c803b386b41da1c102b2b3d1fba1b2a/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:05:39 localhost systemd[1]: Started libpod-conmon-3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490.scope. Nov 26 03:05:39 localhost podman[57614]: 2025-11-26 08:05:39.730155793 +0000 UTC m=+0.055175311 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:05:39 localhost podman[57607]: 2025-11-26 08:05:39.832702697 +0000 UTC m=+0.184945670 container init be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 
'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Nov 26 03:05:39 localhost systemd[1]: Started libcrun container. Nov 26 03:05:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3df8000495686f63312283822b59896f01c9669473559757e06844f715334a89/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Nov 26 03:05:39 localhost podman[57607]: 2025-11-26 08:05:39.845831469 +0000 UTC m=+0.198074442 container start be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute_init_log, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, 
config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:05:39 localhost python3[57534]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764143208 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Nov 26 03:05:39 localhost systemd[1]: libpod-be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628.scope: Deactivated successfully. 
Nov 26 03:05:39 localhost podman[57614]: 2025-11-26 08:05:39.854848156 +0000 UTC m=+0.179867644 container init 3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtqemud_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:05:39 localhost podman[57614]: 
2025-11-26 08:05:39.863742668 +0000 UTC m=+0.188762156 container start 3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, architecture=x86_64, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, build-date=2025-11-19T00:35:22Z) Nov 26 03:05:39 localhost python3[57534]: ansible-tripleo_container_manage 
PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764143208 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm Nov 26 03:05:39 localhost systemd[1]: libpod-3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490.scope: Deactivated successfully. 
Nov 26 03:05:39 localhost podman[57648]: 2025-11-26 08:05:39.941156011 +0000 UTC m=+0.069845772 container died be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step2, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, container_name=nova_compute_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-compute-container) Nov 26 03:05:39 localhost podman[57648]: 2025-11-26 08:05:39.962255817 +0000 UTC m=+0.090945558 container cleanup 
be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=nova_compute_init_log, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-compute, config_id=tripleo_step2, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:05:39 localhost systemd[1]: libpod-conmon-be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628.scope: Deactivated successfully. 
Nov 26 03:05:40 localhost podman[57662]: 2025-11-26 08:05:40.041866308 +0000 UTC m=+0.153056123 container died 3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt) Nov 26 03:05:40 localhost podman[57662]: 
2025-11-26 08:05:40.066338938 +0000 UTC m=+0.177528703 container cleanup 3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, config_id=tripleo_step2, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt) Nov 26 03:05:40 localhost systemd[1]: 
libpod-conmon-3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490.scope: Deactivated successfully. Nov 26 03:05:40 localhost podman[57794]: 2025-11-26 08:05:40.492588432 +0000 UTC m=+0.098544981 container create f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, container_name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}) Nov 26 03:05:40 localhost podman[57794]: 2025-11-26 08:05:40.430032875 +0000 UTC m=+0.035989464 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:05:40 localhost podman[57807]: 2025-11-26 08:05:40.531706361 +0000 UTC m=+0.109257649 container create f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step2, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12) Nov 26 03:05:40 localhost systemd[1]: Started libpod-conmon-f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944.scope. Nov 26 03:05:40 localhost systemd[1]: Started libcrun container. 
Nov 26 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97061593b7b68895c2bf349b5cf4d3820f346f775d77096764091fa4e8f8d995/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Nov 26 03:05:40 localhost podman[57794]: 2025-11-26 08:05:40.557567105 +0000 UTC m=+0.163523664 container init f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-libvirt, url=https://www.redhat.com, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public) Nov 26 03:05:40 localhost systemd[1]: Started libpod-conmon-f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069.scope. 
Nov 26 03:05:40 localhost podman[57794]: 2025-11-26 08:05:40.568655405 +0000 UTC m=+0.174611934 container start f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vcs-type=git, container_name=create_virtlogd_wrapper, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, url=https://www.redhat.com, release=1761123044, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:05:40 localhost podman[57794]: 2025-11-26 08:05:40.570727288 +0000 UTC m=+0.176683977 container attach f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step2, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, container_name=create_virtlogd_wrapper, 
config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:35:22Z) Nov 26 03:05:40 localhost podman[57807]: 2025-11-26 08:05:40.478949695 +0000 UTC m=+0.056501023 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 26 03:05:40 localhost systemd[1]: Started libcrun container. 
Nov 26 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82f216961b52eae10893792e45fbdb3b64d10f3200729b24023287a013ca587/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 03:05:40 localhost podman[57807]: 2025-11-26 08:05:40.593554747 +0000 UTC m=+0.171106055 container init f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:05:40 localhost podman[57807]: 2025-11-26 08:05:40.601705107 +0000 UTC m=+0.179256405 container start f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vendor=Red Hat, Inc., config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include 
::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git) Nov 26 03:05:40 localhost podman[57807]: 2025-11-26 08:05:40.60209747 +0000 UTC m=+0.179648768 container attach f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=create_haproxy_wrapper, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, version=17.1.12, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, io.openshift.expose-services=) Nov 26 03:05:40 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 3.11 deep-scrub starts Nov 26 03:05:40 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 3.11 deep-scrub ok Nov 26 03:05:40 localhost systemd[1]: var-lib-containers-storage-overlay-3df8000495686f63312283822b59896f01c9669473559757e06844f715334a89-merged.mount: Deactivated successfully. Nov 26 03:05:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490-userdata-shm.mount: Deactivated successfully. Nov 26 03:05:40 localhost systemd[1]: var-lib-containers-storage-overlay-a9100a267b24367cb8370dbd929de9892c803b386b41da1c102b2b3d1fba1b2a-merged.mount: Deactivated successfully. Nov 26 03:05:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628-userdata-shm.mount: Deactivated successfully. Nov 26 03:05:41 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 5.12 scrub starts Nov 26 03:05:41 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 5.12 scrub ok Nov 26 03:05:42 localhost ovs-vsctl[57898]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Nov 26 03:05:42 localhost systemd[1]: libpod-f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944.scope: Deactivated successfully. Nov 26 03:05:42 localhost systemd[1]: libpod-f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944.scope: Consumed 2.082s CPU time. 
Nov 26 03:05:42 localhost podman[57989]: 2025-11-26 08:05:42.700898461 +0000 UTC m=+0.033423276 container died f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=create_virtlogd_wrapper, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
release=1761123044, io.openshift.expose-services=, config_id=tripleo_step2, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 26 03:05:42 localhost systemd[1]: tmp-crun.y89lLS.mount: Deactivated successfully. Nov 26 03:05:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944-userdata-shm.mount: Deactivated successfully. Nov 26 03:05:42 localhost podman[57989]: 2025-11-26 08:05:42.741579597 +0000 UTC m=+0.074104412 container cleanup f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step2, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, container_name=create_virtlogd_wrapper, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'command': 
['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12) Nov 26 03:05:42 localhost systemd[1]: libpod-conmon-f8118db956e544120def2cacecabfa4c7c38cd832f024cf42788c5abc74db944.scope: Deactivated successfully. 
Nov 26 03:05:42 localhost python3[57534]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764143208 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro 
--volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Nov 26 03:05:43 localhost systemd[1]: libpod-f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069.scope: Deactivated successfully. Nov 26 03:05:43 localhost systemd[1]: libpod-f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069.scope: Consumed 2.237s CPU time. Nov 26 03:05:43 localhost podman[57807]: 2025-11-26 08:05:43.6554911 +0000 UTC m=+3.233042368 container died f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=) Nov 26 03:05:43 localhost systemd[1]: var-lib-containers-storage-overlay-97061593b7b68895c2bf349b5cf4d3820f346f775d77096764091fa4e8f8d995-merged.mount: Deactivated successfully. Nov 26 03:05:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069-userdata-shm.mount: Deactivated successfully. 
Nov 26 03:05:43 localhost systemd[1]: var-lib-containers-storage-overlay-f82f216961b52eae10893792e45fbdb3b64d10f3200729b24023287a013ca587-merged.mount: Deactivated successfully. Nov 26 03:05:43 localhost podman[58088]: 2025-11-26 08:05:43.718327067 +0000 UTC m=+0.057415391 container cleanup f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step2, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:05:43 localhost systemd[1]: libpod-conmon-f80b99e9ccb9d4c8b192bd89ee6f17979ddf68682155996a4408c78250c3b069.scope: Deactivated successfully. 
Nov 26 03:05:43 localhost python3[57534]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Nov 26 03:05:44 localhost python3[58143]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:44 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.e deep-scrub starts Nov 26 03:05:44 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 3.e deep-scrub ok Nov 26 03:05:45 localhost python3[58264]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005536118 step=2 update_config_hash_only=False Nov 26 03:05:46 localhost python3[58280]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:05:46 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 5.13 scrub starts Nov 26 
03:05:46 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 5.13 scrub ok Nov 26 03:05:46 localhost python3[58296]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 26 03:05:50 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 6.12 scrub starts Nov 26 03:05:50 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 6.12 scrub ok Nov 26 03:05:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:05:50 localhost podman[58297]: 2025-11-26 08:05:50.833357691 +0000 UTC m=+0.092534067 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1) Nov 26 03:05:51 localhost podman[58297]: 2025-11-26 08:05:51.048419443 +0000 UTC m=+0.307595819 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64) Nov 26 03:05:51 localhost systemd[1]: 
b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:05:53 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 6.1b scrub starts Nov 26 03:05:53 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 6.1b scrub ok Nov 26 03:05:53 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 5.d scrub starts Nov 26 03:05:53 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 5.d scrub ok Nov 26 03:05:56 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.15 deep-scrub starts Nov 26 03:05:56 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.15 deep-scrub ok Nov 26 03:05:58 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.1c scrub starts Nov 26 03:05:58 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.1c scrub ok Nov 26 03:05:59 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 3.8 scrub starts Nov 26 03:05:59 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 3.8 scrub ok Nov 26 03:06:00 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.1d scrub starts Nov 26 03:06:00 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.1d scrub ok Nov 26 03:06:01 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.1 scrub starts Nov 26 03:06:01 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.1 scrub ok Nov 26 03:06:02 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.8 deep-scrub starts Nov 26 03:06:02 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 4.8 deep-scrub ok Nov 26 03:06:03 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.c scrub starts Nov 26 03:06:03 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.c scrub ok Nov 26 03:06:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:06:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 
1200.1 total, 600.0 interval#012Cumulative writes: 5167 writes, 23K keys, 5167 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5167 writes, 564 syncs, 9.16 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1765 writes, 6551 keys, 1765 commit groups, 1.0 writes per commit group, ingest: 2.68 MB, 0.00 MB/s#012Interval WAL: 1765 writes, 358 syncs, 4.93 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 
0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me Nov 26 03:06:05 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.6 scrub starts Nov 26 03:06:05 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.6 scrub ok Nov 26 03:06:06 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.9 scrub starts Nov 26 03:06:06 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 4.9 scrub ok Nov 26 03:06:06 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.10 scrub starts Nov 26 03:06:06 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 2.10 scrub ok Nov 26 03:06:07 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 7.5 scrub starts Nov 26 03:06:07 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] 
: 7.5 scrub ok Nov 26 03:06:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:06:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4310 writes, 19K keys, 4310 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4310 writes, 413 syncs, 10.44 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1062 writes, 3615 keys, 1062 commit groups, 1.0 writes per commit group, ingest: 1.95 MB, 0.00 MB/s#012Interval WAL: 1062 writes, 273 syncs, 3.89 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.02 0.00 1 0.022 0 0 
0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5593355362d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) 
Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5593355362d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 m Nov 26 03:06:14 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 7.a scrub starts Nov 26 03:06:14 localhost ceph-osd[32631]: log_channel(cluster) log [DBG] : 7.a scrub ok Nov 26 03:06:16 localhost sshd[58326]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:06:16 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.b scrub starts Nov 26 03:06:16 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.b scrub ok Nov 26 03:06:20 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.d scrub starts Nov 26 03:06:20 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.d scrub ok Nov 26 
03:06:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:06:21 localhost podman[58328]: 2025-11-26 08:06:21.795081784 +0000 UTC m=+0.062969861 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4) Nov 26 03:06:21 localhost podman[58328]: 2025-11-26 08:06:21.970342037 +0000 UTC m=+0.238230154 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=metrics_qdr, 
batch=17.1_20251118.1, distribution-scope=public, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12) Nov 26 03:06:21 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:06:24 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.13 scrub starts Nov 26 03:06:24 localhost ceph-osd[31674]: log_channel(cluster) log [DBG] : 2.13 scrub ok Nov 26 03:06:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:06:53 localhost systemd[1]: tmp-crun.WE6Faf.mount: Deactivated successfully. 
Nov 26 03:06:53 localhost podman[58486]: 2025-11-26 08:06:53.722952981 +0000 UTC m=+0.984507836 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, 
vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:06:53 localhost podman[58486]: 2025-11-26 08:06:53.95032249 +0000 UTC m=+1.211877295 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z) Nov 26 03:06:53 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:06:57 localhost sshd[58515]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:07:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:07:24 localhost systemd[1]: tmp-crun.obOwTD.mount: Deactivated successfully. 
Nov 26 03:07:24 localhost podman[58517]: 2025-11-26 08:07:24.828561187 +0000 UTC m=+0.091223128 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:07:25 localhost podman[58517]: 2025-11-26 08:07:25.048392164 +0000 UTC m=+0.311054065 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, 
com.redhat.component=openstack-qdrouterd-container, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team) Nov 26 03:07:25 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:07:40 localhost sshd[58623]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:07:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:07:55 localhost systemd[1]: tmp-crun.CnlTeg.mount: Deactivated successfully. 
Nov 26 03:07:55 localhost podman[58625]: 2025-11-26 08:07:55.811749355 +0000 UTC m=+0.078996888 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, release=1761123044, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Nov 26 03:07:56 localhost podman[58625]: 2025-11-26 08:07:56.017166145 +0000 UTC m=+0.284413678 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:07:56 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:08:24 localhost sshd[58654]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:08:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:08:26 localhost systemd[1]: tmp-crun.rXyDzE.mount: Deactivated successfully. 
Nov 26 03:08:26 localhost podman[58656]: 2025-11-26 08:08:26.810745581 +0000 UTC m=+0.073103028 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, maintainer=OpenStack TripleO Team) Nov 26 03:08:26 localhost podman[58656]: 2025-11-26 08:08:26.989370479 +0000 UTC m=+0.251727976 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-qdrouterd) Nov 26 03:08:26 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:08:29 localhost systemd[1]: tmp-crun.zRcZxR.mount: Deactivated successfully. 
Nov 26 03:08:29 localhost podman[58784]: 2025-11-26 08:08:29.079899342 +0000 UTC m=+0.096817407 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, RELEASE=main, maintainer=Guillaume Abrioux , version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, CEPH_POINT_RELEASE=) Nov 26 03:08:29 localhost podman[58784]: 2025-11-26 08:08:29.205274375 +0000 UTC m=+0.222192380 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, GIT_CLEAN=True, release=553, RELEASE=main, ceph=True, 
io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph) Nov 26 03:08:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:08:57 localhost podman[58924]: 2025-11-26 08:08:57.812520693 +0000 UTC m=+0.074988836 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1761123044, vcs-type=git) Nov 26 03:08:58 localhost podman[58924]: 2025-11-26 08:08:58.03136981 +0000 UTC m=+0.293837903 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64) Nov 26 03:08:58 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:09:06 localhost sshd[58954]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:09:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:09:28 localhost systemd[1]: tmp-crun.TL8Oqz.mount: Deactivated successfully. Nov 26 03:09:28 localhost podman[58956]: 2025-11-26 08:09:28.848437327 +0000 UTC m=+0.107628618 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr) Nov 26 03:09:29 localhost podman[58956]: 2025-11-26 08:09:29.098436188 +0000 UTC m=+0.357627489 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, release=1761123044, com.redhat.component=openstack-qdrouterd-container, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:09:29 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:09:50 localhost sshd[59063]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:09:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:09:59 localhost podman[59065]: 2025-11-26 08:09:59.818544367 +0000 UTC m=+0.081855598 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:10:00 localhost podman[59065]: 2025-11-26 08:10:00.044606982 +0000 UTC m=+0.307918203 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:10:00 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:10:21 localhost python3[59141]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:10:21 localhost python3[59186]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144620.7716272-99121-36694456494034/source _original_basename=tmpiozpbvif follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:22 localhost python3[59216]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:24 localhost ansible-async_wrapper.py[59388]: Invoked with 184407856279 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144623.5706718-99343-245436207750892/AnsiballZ_command.py _ Nov 26 03:10:24 localhost ansible-async_wrapper.py[59391]: Starting module and watcher Nov 26 03:10:24 localhost ansible-async_wrapper.py[59391]: Start watching 59392 (3600) Nov 26 03:10:24 localhost ansible-async_wrapper.py[59392]: Start module (59392) Nov 26 03:10:24 localhost ansible-async_wrapper.py[59388]: Return async_wrapper task started. Nov 26 03:10:24 localhost python3[59412]: ansible-ansible.legacy.async_status Invoked with jid=184407856279.59388 mode=status _async_dir=/tmp/.ansible_async Nov 26 03:10:27 localhost puppet-user[59411]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 26 03:10:27 localhost puppet-user[59411]: (file: /etc/puppet/hiera.yaml) Nov 26 03:10:27 localhost puppet-user[59411]: Warning: Undefined variable '::deploy_config_name'; Nov 26 03:10:27 localhost puppet-user[59411]: (file & line not available) Nov 26 03:10:27 localhost puppet-user[59411]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 03:10:27 localhost puppet-user[59411]: (file & line not available) Nov 26 03:10:27 localhost puppet-user[59411]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 26 03:10:27 localhost puppet-user[59411]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 26 03:10:27 localhost puppet-user[59411]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.12 seconds Nov 26 03:10:28 localhost puppet-user[59411]: Notice: Applied catalog in 0.04 seconds Nov 26 03:10:28 localhost puppet-user[59411]: Application: Nov 26 03:10:28 localhost puppet-user[59411]: Initial environment: production Nov 26 03:10:28 localhost puppet-user[59411]: Converged environment: production Nov 26 03:10:28 localhost puppet-user[59411]: Run mode: user Nov 26 03:10:28 localhost puppet-user[59411]: Changes: Nov 26 03:10:28 localhost puppet-user[59411]: Events: Nov 26 03:10:28 localhost puppet-user[59411]: Resources: Nov 26 03:10:28 localhost puppet-user[59411]: Total: 10 Nov 26 03:10:28 localhost puppet-user[59411]: Time: Nov 26 03:10:28 localhost puppet-user[59411]: Schedule: 0.00 Nov 26 03:10:28 localhost puppet-user[59411]: File: 0.00 Nov 26 03:10:28 localhost puppet-user[59411]: Exec: 0.01 Nov 26 03:10:28 localhost puppet-user[59411]: Augeas: 0.01 Nov 26 03:10:28 localhost puppet-user[59411]: Transaction evaluation: 
0.03 Nov 26 03:10:28 localhost puppet-user[59411]: Catalog application: 0.04 Nov 26 03:10:28 localhost puppet-user[59411]: Config retrieval: 0.16 Nov 26 03:10:28 localhost puppet-user[59411]: Last run: 1764144628 Nov 26 03:10:28 localhost puppet-user[59411]: Filebucket: 0.00 Nov 26 03:10:28 localhost puppet-user[59411]: Total: 0.04 Nov 26 03:10:28 localhost puppet-user[59411]: Version: Nov 26 03:10:28 localhost puppet-user[59411]: Config: 1764144627 Nov 26 03:10:28 localhost puppet-user[59411]: Puppet: 7.10.0 Nov 26 03:10:28 localhost ansible-async_wrapper.py[59392]: Module complete (59392) Nov 26 03:10:29 localhost ansible-async_wrapper.py[59391]: Done in kid B. Nov 26 03:10:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:10:30 localhost systemd[1]: tmp-crun.u4LLMt.mount: Deactivated successfully. Nov 26 03:10:30 localhost podman[59523]: 2025-11-26 08:10:30.82769144 +0000 UTC m=+0.088603876 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, architecture=x86_64, release=1761123044, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:10:31 localhost podman[59523]: 2025-11-26 08:10:31.007348874 +0000 UTC m=+0.268261360 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, version=17.1.12, release=1761123044, vcs-type=git, config_id=tripleo_step1, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Nov 26 03:10:31 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:10:33 localhost sshd[59614]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:10:34 localhost python3[59646]: ansible-ansible.legacy.async_status Invoked with jid=184407856279.59388 mode=status _async_dir=/tmp/.ansible_async Nov 26 03:10:35 localhost python3[59662]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 26 03:10:35 localhost python3[59678]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:36 localhost python3[59728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:10:36 localhost python3[59746]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp67iqmibs recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 26 03:10:37 localhost 
python3[59776]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:38 localhost python3[59879]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 26 03:10:39 localhost python3[59898]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:40 localhost python3[59930]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:40 localhost python3[59980]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:10:41 localhost python3[59998]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root 
group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:41 localhost python3[60060]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:10:41 localhost python3[60078]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:42 localhost python3[60140]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:10:42 localhost python3[60158]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None 
attributes=None Nov 26 03:10:43 localhost python3[60220]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:10:43 localhost python3[60238]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:44 localhost python3[60268]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:10:44 localhost systemd[1]: Reloading. Nov 26 03:10:44 localhost systemd-sysv-generator[60296]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:10:44 localhost systemd-rc-local-generator[60292]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:10:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 03:10:44 localhost python3[60354]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:10:45 localhost python3[60372]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:45 localhost python3[60434]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:10:45 localhost python3[60452]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:46 localhost python3[60482]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:10:46 localhost systemd[1]: Reloading. Nov 26 03:10:46 localhost systemd-rc-local-generator[60504]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 03:10:46 localhost systemd-sysv-generator[60511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:10:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:10:46 localhost systemd[1]: Starting Create netns directory... Nov 26 03:10:46 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 03:10:46 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 03:10:46 localhost systemd[1]: Finished Create netns directory. Nov 26 03:10:47 localhost python3[60540]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 26 03:10:49 localhost python3[60596]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 26 03:10:49 localhost podman[60745]: 2025-11-26 08:10:49.444787047 +0000 UTC m=+0.066073606 container create 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 26 03:10:49 localhost podman[60748]: 2025-11-26 08:10:49.466404528 +0000 UTC m=+0.076697220 container create da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, release=1761123044) Nov 26 03:10:49 localhost systemd[1]: Started libpod-conmon-1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.scope. Nov 26 03:10:49 localhost systemd[1]: Started libpod-conmon-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de.scope. Nov 26 03:10:49 localhost podman[60790]: 2025-11-26 08:10:49.497752399 +0000 UTC m=+0.066566420 container create 09cc51740922973a562a88ec1d07ccee0f68805063f5f5804d728047ab977d74 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_init_log, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 26 03:10:49 localhost systemd[1]: Started libcrun container. Nov 26 03:10:49 localhost systemd[1]: Started libcrun container. Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab1eeb830657f9ab8bbf0a1c1e595d808a09550d63278050b820041c6a307d5f/merged/scripts supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost podman[60746]: 2025-11-26 08:10:49.510635194 +0000 UTC m=+0.127166997 container create 8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, url=https://www.redhat.com, container_name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
config_id=tripleo_step3, io.buildah.version=1.41.4, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab1eeb830657f9ab8bbf0a1c1e595d808a09550d63278050b820041c6a307d5f/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost podman[60745]: 2025-11-26 08:10:49.412025943 +0000 UTC m=+0.033312502 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 26 03:10:49 localhost systemd[1]: Started libpod-conmon-09cc51740922973a562a88ec1d07ccee0f68805063f5f5804d728047ab977d74.scope. 
Nov 26 03:10:49 localhost podman[60748]: 2025-11-26 08:10:49.519576028 +0000 UTC m=+0.129868580 container init da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:10:49 localhost podman[60748]: 2025-11-26 08:10:49.421847773 +0000 UTC m=+0.032140345 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 26 03:10:49 localhost systemd[1]: Started libcrun container. Nov 26 03:10:49 localhost systemd[1]: Started libpod-conmon-8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93.scope. Nov 26 03:10:49 localhost podman[60748]: 2025-11-26 08:10:49.528251383 +0000 UTC m=+0.138543935 container start da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, container_name=rsyslog, config_id=tripleo_step3, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 rsyslog) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd66387fc89c132424d887c45935af2d1eacff944555d01cc251c3d6d1b83282/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env 
KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=89e2bf3e240198013fa934e7fe0b50df --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume 
/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 26 03:10:49 localhost podman[60790]: 2025-11-26 08:10:49.537151807 +0000 UTC m=+0.105965838 container init 09cc51740922973a562a88ec1d07ccee0f68805063f5f5804d728047ab977d74 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_init_log, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step3) Nov 26 03:10:49 localhost podman[60790]: 2025-11-26 08:10:49.541951153 +0000 UTC m=+0.110765184 container start 09cc51740922973a562a88ec1d07ccee0f68805063f5f5804d728047ab977d74 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_init_log, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, 
distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12) Nov 26 03:10:49 localhost podman[60790]: 2025-11-26 08:10:49.465821261 +0000 UTC m=+0.034635292 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 26 03:10:49 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Nov 26 03:10:49 localhost systemd[1]: Started libcrun container. Nov 26 03:10:49 localhost systemd[1]: libpod-09cc51740922973a562a88ec1d07ccee0f68805063f5f5804d728047ab977d74.scope: Deactivated successfully. 
Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost podman[60746]: 2025-11-26 08:10:49.461746056 +0000 UTC m=+0.078277859 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:49 localhost podman[60746]: 2025-11-26 08:10:49.568021993 +0000 UTC m=+0.184553786 container init 
8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, container_name=nova_virtlogd_wrapper, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12) Nov 26 03:10:49 localhost podman[60746]: 2025-11-26 08:10:49.573824491 +0000 UTC m=+0.190356324 container start 8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:10:49 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 
--volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:49 localhost systemd[1]: libpod-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de.scope: Deactivated successfully. Nov 26 03:10:49 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:10:49 localhost podman[60851]: 2025-11-26 08:10:49.599328012 +0000 UTC m=+0.041494193 container died 09cc51740922973a562a88ec1d07ccee0f68805063f5f5804d728047ab977d74 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, config_id=tripleo_step3, container_name=ceilometer_init_log, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:10:49 localhost systemd[1]: Created slice User Slice of UID 0. Nov 26 03:10:49 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 26 03:10:49 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 26 03:10:49 localhost systemd[1]: Starting User Manager for UID 0... Nov 26 03:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:10:49 localhost podman[60745]: 2025-11-26 08:10:49.641780533 +0000 UTC m=+0.263067092 container init 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., container_name=collectd) Nov 26 03:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:10:49 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. 
Nov 26 03:10:49 localhost podman[60745]: 2025-11-26 08:10:49.667401788 +0000 UTC m=+0.288688327 container start 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Nov 26 03:10:49 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4767aaabc3de112d8791c290aa2b669d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 26 03:10:49 localhost podman[60813]: 2025-11-26 08:10:49.612374661 +0000 UTC m=+0.129994914 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 26 03:10:49 localhost 
podman[60852]: 2025-11-26 08:10:49.730623125 +0000 UTC m=+0.173826647 container cleanup 09cc51740922973a562a88ec1d07ccee0f68805063f5f5804d728047ab977d74 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_init_log, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044) Nov 26 03:10:49 localhost systemd[1]: libpod-conmon-09cc51740922973a562a88ec1d07ccee0f68805063f5f5804d728047ab977d74.scope: Deactivated successfully. 
Nov 26 03:10:49 localhost systemd[60913]: Queued start job for default target Main User Target. Nov 26 03:10:49 localhost systemd[60913]: Created slice User Application Slice. Nov 26 03:10:49 localhost systemd[60913]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 26 03:10:49 localhost systemd[60913]: Started Daily Cleanup of User's Temporary Directories. Nov 26 03:10:49 localhost systemd[60913]: Reached target Paths. Nov 26 03:10:49 localhost systemd[60913]: Reached target Timers. Nov 26 03:10:49 localhost podman[60894]: 2025-11-26 08:10:49.747469611 +0000 UTC m=+0.139312490 container died da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=rsyslog, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4) Nov 26 03:10:49 localhost systemd[60913]: Starting D-Bus User Message Bus Socket... Nov 26 03:10:49 localhost systemd[60913]: Starting Create User's Volatile Files and Directories... Nov 26 03:10:49 localhost systemd[60913]: Finished Create User's Volatile Files and Directories. Nov 26 03:10:49 localhost systemd[60913]: Listening on D-Bus User Message Bus Socket. Nov 26 03:10:49 localhost systemd[60913]: Reached target Sockets. Nov 26 03:10:49 localhost systemd[60913]: Reached target Basic System. Nov 26 03:10:49 localhost systemd[60913]: Reached target Main User Target. Nov 26 03:10:49 localhost systemd[60913]: Startup finished in 115ms. Nov 26 03:10:49 localhost systemd[1]: Started User Manager for UID 0. Nov 26 03:10:49 localhost systemd[1]: Started Session c1 of User root. 
Nov 26 03:10:49 localhost systemd[1]: Started Session c2 of User root. Nov 26 03:10:49 localhost podman[60931]: 2025-11-26 08:10:49.786017482 +0000 UTC m=+0.115216252 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, 
config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044) Nov 26 03:10:49 localhost podman[60931]: 2025-11-26 08:10:49.790559651 +0000 UTC m=+0.119758431 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044) Nov 26 03:10:49 localhost podman[60931]: unhealthy Nov 26 03:10:49 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:10:49 localhost systemd[1]: 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Failed with result 'exit-code'. Nov 26 03:10:49 localhost podman[60894]: 2025-11-26 08:10:49.82053891 +0000 UTC m=+0.212381769 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=rsyslog, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1) Nov 26 03:10:49 localhost systemd[1]: libpod-conmon-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de.scope: Deactivated successfully. Nov 26 03:10:49 localhost systemd[1]: session-c2.scope: Deactivated successfully. Nov 26 03:10:49 localhost systemd[1]: session-c1.scope: Deactivated successfully. 
Nov 26 03:10:49 localhost podman[60813]: 2025-11-26 08:10:49.901026056 +0000 UTC m=+0.418646299 container create 20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, url=https://www.redhat.com, container_name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, 
version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:10:49 localhost systemd[1]: Started libpod-conmon-20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5.scope. Nov 26 03:10:49 localhost systemd[1]: Started libcrun container. Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ede8908568d13e7e992d722f3ffe3beb21394f78b9650d7ef9bdba7da629a5/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ede8908568d13e7e992d722f3ffe3beb21394f78b9650d7ef9bdba7da629a5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ede8908568d13e7e992d722f3ffe3beb21394f78b9650d7ef9bdba7da629a5/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:49 localhost podman[60813]: 2025-11-26 08:10:49.968312998 +0000 UTC m=+0.485933251 container init 20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, version=17.1.12, 
tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:10:49 localhost podman[60813]: 2025-11-26 08:10:49.977412897 +0000 UTC m=+0.495033170 container start 20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, architecture=x86_64, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, container_name=nova_statedir_owner, vcs-type=git, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:10:49 localhost podman[60813]: 2025-11-26 08:10:49.977694896 +0000 UTC m=+0.495315139 container attach 20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_statedir_owner, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z) Nov 26 03:10:50 localhost systemd[1]: libpod-20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5.scope: Deactivated successfully. 
Nov 26 03:10:50 localhost podman[60813]: 2025-11-26 08:10:50.048758793 +0000 UTC m=+0.566379046 container died 20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 
config_id=tripleo_step3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:10:50 localhost podman[61054]: 2025-11-26 08:10:50.108797933 +0000 UTC m=+0.053417157 container cleanup 20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_statedir_owner, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step3, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:10:50 localhost systemd[1]: libpod-conmon-20d5e680e185a856b1f85c19d75531d38793834fbbb109b69e76d105ade86df5.scope: Deactivated successfully. Nov 26 03:10:50 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1764143208 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py Nov 26 
03:10:50 localhost podman[61100]: 2025-11-26 08:10:50.210085327 +0000 UTC m=+0.050576081 container create 9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-libvirt, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:10:50 localhost systemd[1]: Started libpod-conmon-9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da.scope. Nov 26 03:10:50 localhost systemd[1]: Started libcrun container. 
Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cad70600db6f91a3503354117d6c001b29942b13ad48c875233bf63324fad13/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cad70600db6f91a3503354117d6c001b29942b13ad48c875233bf63324fad13/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cad70600db6f91a3503354117d6c001b29942b13ad48c875233bf63324fad13/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cad70600db6f91a3503354117d6c001b29942b13ad48c875233bf63324fad13/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost podman[61100]: 2025-11-26 08:10:50.265409831 +0000 UTC m=+0.105900535 container init 9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, 
com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4) Nov 26 03:10:50 localhost podman[61100]: 2025-11-26 08:10:50.27353703 +0000 UTC m=+0.114027734 container start 9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Nov 26 03:10:50 localhost podman[61100]: 2025-11-26 08:10:50.186331628 +0000 UTC m=+0.026822352 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:50 localhost 
systemd[1]: var-lib-containers-storage-overlay-4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283-merged.mount: Deactivated successfully. Nov 26 03:10:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de-userdata-shm.mount: Deactivated successfully. Nov 26 03:10:50 localhost podman[61170]: 2025-11-26 08:10:50.538485009 +0000 UTC m=+0.099443678 container create 906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:10:50 localhost systemd[1]: Started libpod-conmon-906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b.scope. 
Nov 26 03:10:50 localhost podman[61170]: 2025-11-26 08:10:50.492921664 +0000 UTC m=+0.053880373 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:50 localhost systemd[1]: Started libcrun container. Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:50 localhost podman[61170]: 2025-11-26 08:10:50.6201047 +0000 UTC m=+0.181063369 
container init 906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', 
'/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Nov 26 03:10:50 localhost podman[61170]: 2025-11-26 08:10:50.634985896 +0000 UTC m=+0.195944535 container start 906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, tcib_managed=true, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
container_name=nova_virtsecretd, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 26 03:10:50 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume 
/run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:50 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:10:50 localhost systemd[1]: Started Session c3 of User root. Nov 26 03:10:50 localhost systemd[1]: session-c3.scope: Deactivated successfully. Nov 26 03:10:51 localhost podman[61309]: 2025-11-26 08:10:51.051951053 +0000 UTC m=+0.081893980 container create 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 26 03:10:51 localhost systemd[1]: Started libpod-conmon-1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.scope. 
Nov 26 03:10:51 localhost podman[61317]: 2025-11-26 08:10:51.093103214 +0000 UTC m=+0.107187805 container create 5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, distribution-scope=public, config_id=tripleo_step3, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_virtnodedevd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 
'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}) Nov 26 03:10:51 localhost systemd[1]: Started libcrun container. 
Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14988db00eaf3274b740fc90a2db62d16af3a82b44457432a1a6aa29dc90bda4/merged/etc/target supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14988db00eaf3274b740fc90a2db62d16af3a82b44457432a1a6aa29dc90bda4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost podman[61309]: 2025-11-26 08:10:51.004590991 +0000 UTC m=+0.034533988 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 26 03:10:51 localhost systemd[1]: Started libpod-conmon-5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d.scope. Nov 26 03:10:51 localhost podman[61317]: 2025-11-26 08:10:51.040269235 +0000 UTC m=+0.054353946 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:51 localhost systemd[1]: Started libcrun container. Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being 
remounted at /var/lib/containers/storage/overlay/0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:10:51 localhost podman[61309]: 2025-11-26 08:10:51.150092 +0000 UTC m=+0.180035017 container init 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, vcs-type=git, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4) Nov 26 03:10:51 localhost podman[61317]: 2025-11-26 08:10:51.151484083 +0000 UTC m=+0.165568714 container init 5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:10:51 localhost podman[61317]: 2025-11-26 08:10:51.161545661 +0000 UTC m=+0.175630272 container start 5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 
'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtnodedevd, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.buildah.version=1.41.4, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Nov 26 03:10:51 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:10:51 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:10:51 localhost podman[61309]: 2025-11-26 08:10:51.1853212 +0000 UTC m=+0.215264147 container start 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=) Nov 26 03:10:51 localhost systemd[1]: Started Session c4 of User root. 
Nov 26 03:10:51 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=62782eb5f982aaac812488dee300321e --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 26 03:10:51 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:10:51 localhost systemd[1]: Started Session c5 of User root. Nov 26 03:10:51 localhost podman[61355]: 2025-11-26 08:10:51.268480868 +0000 UTC m=+0.077225567 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container) Nov 26 03:10:51 localhost systemd[1]: session-c4.scope: Deactivated successfully. Nov 26 03:10:51 localhost systemd[1]: session-c5.scope: Deactivated successfully. Nov 26 03:10:51 localhost kernel: Loading iSCSI transport class v2.0-870. 
Nov 26 03:10:51 localhost podman[61355]: 2025-11-26 08:10:51.353271586 +0000 UTC m=+0.162016295 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack 
Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git) Nov 26 03:10:51 localhost podman[61355]: unhealthy Nov 26 03:10:51 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:10:51 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Failed with result 'exit-code'. Nov 26 03:10:51 localhost systemd[1]: tmp-crun.ZkMTdH.mount: Deactivated successfully. 
Nov 26 03:10:51 localhost podman[61482]: 2025-11-26 08:10:51.740379598 +0000 UTC m=+0.089935047 container create 73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtstoraged, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, build-date=2025-11-19T00:35:22Z) Nov 26 03:10:51 localhost systemd[1]: Started libpod-conmon-73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb.scope. Nov 26 03:10:51 localhost systemd[1]: Started libcrun container. 
Nov 26 03:10:51 localhost podman[61482]: 2025-11-26 08:10:51.692937714 +0000 UTC m=+0.042493153 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:51 localhost podman[61482]: 2025-11-26 08:10:51.808111673 +0000 UTC m=+0.157667142 container init 
73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, url=https://www.redhat.com, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtstoraged, tcib_managed=true) Nov 26 03:10:51 localhost podman[61482]: 2025-11-26 08:10:51.816739427 +0000 UTC m=+0.166294896 container start 73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=nova_virtstoraged, version=17.1.12, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, 
url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:10:51 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume 
/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:51 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:10:51 localhost systemd[1]: Started Session c6 of User root. Nov 26 03:10:51 localhost systemd[1]: session-c6.scope: Deactivated successfully. Nov 26 03:10:52 localhost podman[61585]: 2025-11-26 08:10:52.203331004 +0000 UTC m=+0.082808868 container create b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-type=git, container_name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, distribution-scope=public, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 26 03:10:52 localhost systemd[1]: Started libpod-conmon-b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365.scope. Nov 26 03:10:52 localhost systemd[1]: Started libcrun container. Nov 26 03:10:52 localhost podman[61585]: 2025-11-26 08:10:52.154734745 +0000 UTC m=+0.034212649 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost podman[61585]: 2025-11-26 08:10:52.270564714 +0000 UTC m=+0.150042588 container init b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, container_name=nova_virtqemud, config_id=tripleo_step3, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:10:52 localhost podman[61585]: 2025-11-26 08:10:52.278974051 +0000 UTC m=+0.158451935 container start b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:10:52 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 
--volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:52 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:10:52 localhost systemd[1]: Started Session c7 of User root. Nov 26 03:10:52 localhost systemd[1]: session-c7.scope: Deactivated successfully. 
Nov 26 03:10:52 localhost podman[61690]: 2025-11-26 08:10:52.687916892 +0000 UTC m=+0.072459041 container create f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, release=1761123044, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git) Nov 26 03:10:52 localhost systemd[1]: Started libpod-conmon-f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9.scope. Nov 26 03:10:52 localhost podman[61690]: 2025-11-26 08:10:52.64247604 +0000 UTC m=+0.027018199 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:52 localhost systemd[1]: Started libcrun container. 
Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 26 03:10:52 localhost podman[61690]: 2025-11-26 08:10:52.758086232 +0000 UTC m=+0.142628411 container init f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': 
['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtproxyd, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container) Nov 26 03:10:52 localhost podman[61690]: 2025-11-26 08:10:52.764494379 +0000 UTC m=+0.149036528 container start f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, container_name=nova_virtproxyd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container) Nov 26 03:10:52 localhost python3[60596]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume 
/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:10:52 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:10:52 localhost systemd[1]: Started Session c8 of User root. Nov 26 03:10:52 localhost systemd[1]: session-c8.scope: Deactivated successfully. Nov 26 03:10:53 localhost python3[61770]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:53 localhost python3[61786]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:54 localhost python3[61802]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:54 localhost python3[61818]: ansible-file 
Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:54 localhost python3[61834]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:55 localhost python3[61850]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:55 localhost python3[61866]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:55 localhost python3[61882]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:55 localhost python3[61899]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:56 localhost python3[61915]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:56 localhost python3[61931]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:56 localhost python3[61947]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:56 localhost python3[61964]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:57 localhost python3[61980]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:57 localhost python3[61996]: ansible-stat Invoked with 
path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:57 localhost python3[62012]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:57 localhost python3[62028]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:58 localhost python3[62044]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:10:58 localhost python3[62105]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:59 localhost python3[62134]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:10:59 localhost python3[62163]: ansible-copy Invoked with 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:11:00 localhost python3[62192]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:11:00 localhost python3[62221]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:11:01 localhost podman[62251]: 2025-11-26 08:11:01.280152566 +0000 UTC m=+0.090628248 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 26 03:11:01 localhost python3[62250]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:11:01 localhost podman[62251]: 2025-11-26 08:11:01.49567985 +0000 UTC m=+0.306155552 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=metrics_qdr) Nov 26 03:11:01 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:11:01 localhost python3[62309]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:11:02 localhost python3[62338]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:11:02 localhost python3[62367]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144658.1782036-100561-175540511593785/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:11:02 localhost systemd[1]: Stopping User Manager for UID 0... Nov 26 03:11:02 localhost systemd[60913]: Activating special unit Exit the Session... Nov 26 03:11:02 localhost systemd[60913]: Stopped target Main User Target. Nov 26 03:11:02 localhost systemd[60913]: Stopped target Basic System. Nov 26 03:11:02 localhost systemd[60913]: Stopped target Paths. Nov 26 03:11:02 localhost systemd[60913]: Stopped target Sockets. 
Nov 26 03:11:02 localhost systemd[60913]: Stopped target Timers. Nov 26 03:11:02 localhost systemd[60913]: Stopped Daily Cleanup of User's Temporary Directories. Nov 26 03:11:02 localhost systemd[60913]: Closed D-Bus User Message Bus Socket. Nov 26 03:11:02 localhost systemd[60913]: Stopped Create User's Volatile Files and Directories. Nov 26 03:11:02 localhost systemd[60913]: Removed slice User Application Slice. Nov 26 03:11:02 localhost systemd[60913]: Reached target Shutdown. Nov 26 03:11:02 localhost systemd[60913]: Finished Exit the Session. Nov 26 03:11:02 localhost systemd[60913]: Reached target Exit the Session. Nov 26 03:11:02 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 26 03:11:02 localhost systemd[1]: Stopped User Manager for UID 0. Nov 26 03:11:02 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 26 03:11:03 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 26 03:11:03 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 26 03:11:03 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 26 03:11:03 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 26 03:11:03 localhost python3[62383]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 03:11:03 localhost systemd[1]: Reloading. Nov 26 03:11:03 localhost systemd-rc-local-generator[62407]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:03 localhost systemd-sysv-generator[62411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 26 03:11:04 localhost python3[62438]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:04 localhost systemd[1]: Reloading. Nov 26 03:11:04 localhost systemd-rc-local-generator[62461]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:04 localhost systemd-sysv-generator[62465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:04 localhost systemd[1]: Starting dnf makecache... Nov 26 03:11:04 localhost systemd[1]: Starting collectd container... Nov 26 03:11:04 localhost systemd[1]: Started collectd container. Nov 26 03:11:04 localhost dnf[62478]: Updating Subscription Management repositories. Nov 26 03:11:05 localhost python3[62505]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:05 localhost systemd[1]: Reloading. Nov 26 03:11:05 localhost systemd-rc-local-generator[62529]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:05 localhost systemd-sysv-generator[62533]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 03:11:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:05 localhost systemd[1]: Starting iscsid container... Nov 26 03:11:05 localhost systemd[1]: Started iscsid container. Nov 26 03:11:06 localhost python3[62571]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:06 localhost systemd[1]: Reloading. Nov 26 03:11:06 localhost systemd-rc-local-generator[62594]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:06 localhost systemd-sysv-generator[62600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:06 localhost dnf[62478]: Metadata cache refreshed recently. Nov 26 03:11:06 localhost systemd[1]: Starting nova_virtlogd_wrapper container... Nov 26 03:11:06 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Nov 26 03:11:06 localhost systemd[1]: Finished dnf makecache. Nov 26 03:11:06 localhost systemd[1]: dnf-makecache.service: Consumed 2.093s CPU time. Nov 26 03:11:06 localhost systemd[1]: Started nova_virtlogd_wrapper container. Nov 26 03:11:07 localhost python3[62638]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:07 localhost systemd[1]: Reloading. 
Nov 26 03:11:07 localhost systemd-sysv-generator[62669]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:07 localhost systemd-rc-local-generator[62663]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:07 localhost systemd[1]: Starting nova_virtnodedevd container... Nov 26 03:11:07 localhost tripleo-start-podman-container[62678]: Creating additional drop-in dependency for "nova_virtnodedevd" (5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d) Nov 26 03:11:07 localhost systemd[1]: Reloading. Nov 26 03:11:07 localhost systemd-rc-local-generator[62734]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:07 localhost systemd-sysv-generator[62739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:08 localhost systemd[1]: Started nova_virtnodedevd container. Nov 26 03:11:08 localhost python3[62760]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:08 localhost systemd[1]: Reloading. Nov 26 03:11:08 localhost systemd-rc-local-generator[62785]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 03:11:08 localhost systemd-sysv-generator[62791]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:09 localhost systemd[1]: Starting nova_virtproxyd container... Nov 26 03:11:09 localhost tripleo-start-podman-container[62800]: Creating additional drop-in dependency for "nova_virtproxyd" (f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9) Nov 26 03:11:09 localhost systemd[1]: Reloading. Nov 26 03:11:09 localhost systemd-rc-local-generator[62856]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:09 localhost systemd-sysv-generator[62860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:09 localhost systemd[1]: Started nova_virtproxyd container. Nov 26 03:11:10 localhost python3[62883]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:10 localhost systemd[1]: Reloading. Nov 26 03:11:10 localhost systemd-sysv-generator[62910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:10 localhost systemd-rc-local-generator[62906]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:10 localhost systemd[1]: Starting nova_virtqemud container... Nov 26 03:11:10 localhost tripleo-start-podman-container[62924]: Creating additional drop-in dependency for "nova_virtqemud" (b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365) Nov 26 03:11:10 localhost systemd[1]: Reloading. Nov 26 03:11:10 localhost systemd-sysv-generator[62985]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:10 localhost systemd-rc-local-generator[62980]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:10 localhost systemd[1]: Started nova_virtqemud container. Nov 26 03:11:11 localhost python3[63008]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:12 localhost systemd[1]: Reloading. Nov 26 03:11:12 localhost systemd-sysv-generator[63041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 03:11:12 localhost systemd-rc-local-generator[63035]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:12 localhost systemd[1]: Starting nova_virtsecretd container... Nov 26 03:11:12 localhost tripleo-start-podman-container[63048]: Creating additional drop-in dependency for "nova_virtsecretd" (906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b) Nov 26 03:11:12 localhost systemd[1]: Reloading. Nov 26 03:11:12 localhost systemd-rc-local-generator[63106]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:12 localhost systemd-sysv-generator[63110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:13 localhost systemd[1]: Started nova_virtsecretd container. Nov 26 03:11:13 localhost python3[63131]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:13 localhost systemd[1]: Reloading. Nov 26 03:11:13 localhost systemd-rc-local-generator[63154]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:13 localhost systemd-sysv-generator[63158]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 03:11:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:14 localhost systemd[1]: Starting nova_virtstoraged container... Nov 26 03:11:14 localhost tripleo-start-podman-container[63171]: Creating additional drop-in dependency for "nova_virtstoraged" (73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb) Nov 26 03:11:14 localhost systemd[1]: Reloading. Nov 26 03:11:14 localhost systemd-rc-local-generator[63229]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:14 localhost systemd-sysv-generator[63232]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:11:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:14 localhost systemd[1]: Started nova_virtstoraged container. Nov 26 03:11:15 localhost python3[63256]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:11:15 localhost systemd[1]: Reloading. Nov 26 03:11:15 localhost systemd-rc-local-generator[63280]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:11:15 localhost systemd-sysv-generator[63290]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 03:11:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:11:15 localhost systemd[1]: Starting rsyslog container... Nov 26 03:11:15 localhost sshd[63302]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:11:15 localhost systemd[1]: Started libcrun container. Nov 26 03:11:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 26 03:11:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 26 03:11:15 localhost podman[63296]: 2025-11-26 08:11:15.7855024 +0000 UTC m=+0.125962451 container init da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, config_id=tripleo_step3) Nov 26 03:11:15 localhost podman[63296]: 2025-11-26 08:11:15.800466779 +0000 UTC m=+0.140926800 container start da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, vcs-type=git, distribution-scope=public, 
release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:11:15 localhost podman[63296]: rsyslog Nov 26 03:11:15 localhost systemd[1]: Started rsyslog container. Nov 26 03:11:15 localhost systemd[1]: libpod-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de.scope: Deactivated successfully. Nov 26 03:11:15 localhost podman[63331]: 2025-11-26 08:11:15.960123531 +0000 UTC m=+0.057155203 container died da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Nov 26 03:11:15 localhost podman[63331]: 2025-11-26 08:11:15.984377744 +0000 UTC m=+0.081409376 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, container_name=rsyslog, name=rhosp17/openstack-rsyslog, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, vcs-type=git, architecture=x86_64) Nov 26 03:11:15 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:11:16 localhost podman[63345]: 2025-11-26 08:11:16.064502229 +0000 UTC m=+0.055166871 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, tcib_managed=true, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-rsyslog, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Nov 26 03:11:16 localhost podman[63345]: rsyslog
Nov 26 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 26 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Nov 26 03:11:16 localhost systemd[1]: Stopped rsyslog container.
Nov 26 03:11:16 localhost systemd[1]: Starting rsyslog container...
Nov 26 03:11:16 localhost systemd[1]: Started libcrun container.
Nov 26 03:11:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 26 03:11:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 26 03:11:16 localhost podman[63374]: 2025-11-26 08:11:16.302461111 +0000 UTC m=+0.105738191 container init da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:16 localhost podman[63374]: 2025-11-26 08:11:16.309648161 +0000 UTC m=+0.112925241 container start da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:16 localhost podman[63374]: rsyslog
Nov 26 03:11:16 localhost systemd[1]: Started rsyslog container.
Nov 26 03:11:16 localhost python3[63373]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:11:16 localhost systemd[1]: libpod-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de.scope: Deactivated successfully.
Nov 26 03:11:16 localhost podman[63398]: 2025-11-26 08:11:16.427635166 +0000 UTC m=+0.033440855 container died da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:16 localhost podman[63398]: 2025-11-26 08:11:16.447172795 +0000 UTC m=+0.052978414 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 26 03:11:16 localhost podman[63410]: 2025-11-26 08:11:16.513188128 +0000 UTC m=+0.041714079 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:16 localhost podman[63410]: rsyslog
Nov 26 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 26 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Nov 26 03:11:16 localhost systemd[1]: Stopped rsyslog container.
Nov 26 03:11:16 localhost systemd[1]: Starting rsyslog container...
Nov 26 03:11:16 localhost systemd[1]: tmp-crun.GmnF1Q.mount: Deactivated successfully.
Nov 26 03:11:16 localhost systemd[1]: var-lib-containers-storage-overlay-4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283-merged.mount: Deactivated successfully.
Nov 26 03:11:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de-userdata-shm.mount: Deactivated successfully.
Nov 26 03:11:16 localhost systemd[1]: Started libcrun container.
Nov 26 03:11:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 26 03:11:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 26 03:11:16 localhost podman[63455]: 2025-11-26 08:11:16.851273318 +0000 UTC m=+0.100768679 container init da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:16 localhost podman[63455]: 2025-11-26 08:11:16.857604081 +0000 UTC m=+0.107099442 container start da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:16 localhost podman[63455]: rsyslog
Nov 26 03:11:16 localhost systemd[1]: Started rsyslog container.
Nov 26 03:11:16 localhost systemd[1]: libpod-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de.scope: Deactivated successfully.
Nov 26 03:11:16 localhost podman[63491]: 2025-11-26 08:11:16.992958709 +0000 UTC m=+0.051697995 container died da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:17 localhost podman[63491]: 2025-11-26 08:11:17.019031418 +0000 UTC m=+0.077770704 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:17 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Nov 26 03:11:17 localhost podman[63518]: 2025-11-26 08:11:17.091443086 +0000 UTC m=+0.045521565 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:17 localhost podman[63518]: rsyslog
Nov 26 03:11:17 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 26 03:11:17 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Nov 26 03:11:17 localhost systemd[1]: Stopped rsyslog container.
Nov 26 03:11:17 localhost systemd[1]: Starting rsyslog container...
Nov 26 03:11:17 localhost systemd[1]: Started libcrun container.
Nov 26 03:11:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 26 03:11:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Nov 26 03:11:17 localhost podman[63560]: 2025-11-26 08:11:17.350103443 +0000 UTC m=+0.116976566 container init da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:17 localhost podman[63560]: 2025-11-26 08:11:17.360558373 +0000 UTC m=+0.127431506 container start da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog)
Nov 26 03:11:17 localhost podman[63560]: rsyslog
Nov 26 03:11:17 localhost systemd[1]: Started rsyslog container.
Nov 26 03:11:17 localhost systemd[1]: libpod-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de.scope: Deactivated successfully.
Nov 26 03:11:17 localhost podman[63597]: 2025-11-26 08:11:17.481577252 +0000 UTC m=+0.041172393 container died da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, name=rhosp17/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, distribution-scope=public, architecture=x86_64, container_name=rsyslog) Nov 26 03:11:17 localhost podman[63597]: 2025-11-26 08:11:17.502533723 +0000 UTC m=+0.062128814 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, distribution-scope=public, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git) Nov 26 03:11:17 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:11:17 localhost podman[63610]: 2025-11-26 08:11:17.594887623 +0000 UTC m=+0.065502058 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog) Nov 26 03:11:17 localhost podman[63610]: rsyslog Nov 26 03:11:17 localhost systemd[1]: 
tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 26 03:11:17 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Nov 26 03:11:17 localhost systemd[1]: Stopped rsyslog container.
Nov 26 03:11:17 localhost systemd[1]: Starting rsyslog container...
Nov 26 03:11:17 localhost systemd[1]: var-lib-containers-storage-overlay-4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283-merged.mount: Deactivated successfully.
Nov 26 03:11:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de-userdata-shm.mount: Deactivated successfully.
Nov 26 03:11:17 localhost python3[63637]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005536118 step=3 update_config_hash_only=False
Nov 26 03:11:17 localhost systemd[1]: Started libcrun container.
Nov 26 03:11:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 26 03:11:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 26 03:11:17 localhost podman[63638]: 2025-11-26 08:11:17.866173716 +0000 UTC m=+0.124454854 container init da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container) Nov 26 03:11:17 localhost podman[63638]: 2025-11-26 08:11:17.874380998 +0000 UTC m=+0.132662136 container start da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, vcs-type=git, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red 
Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3) Nov 26 03:11:17 localhost podman[63638]: rsyslog Nov 26 03:11:17 localhost systemd[1]: Started rsyslog container. Nov 26 03:11:17 localhost systemd[1]: libpod-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de.scope: Deactivated successfully. 
Nov 26 03:11:18 localhost podman[63660]: 2025-11-26 08:11:18.001871584 +0000 UTC m=+0.039437520 container died da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, 
architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog) Nov 26 03:11:18 localhost podman[63660]: 2025-11-26 08:11:18.026637203 +0000 UTC m=+0.064203079 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.buildah.version=1.41.4, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) 
Nov 26 03:11:18 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:11:18 localhost podman[63675]: 2025-11-26 08:11:18.120118248 +0000 UTC m=+0.067671596 container cleanup da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '89e2bf3e240198013fa934e7fe0b50df'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack 
Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 26 03:11:18 localhost podman[63675]: rsyslog
Nov 26 03:11:18 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 26 03:11:18 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Nov 26 03:11:18 localhost systemd[1]: Stopped rsyslog container.
Nov 26 03:11:18 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Nov 26 03:11:18 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Nov 26 03:11:18 localhost systemd[1]: Failed to start rsyslog container.
Nov 26 03:11:18 localhost python3[63703]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:11:18 localhost systemd[1]: var-lib-containers-storage-overlay-4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283-merged.mount: Deactivated successfully.
Nov 26 03:11:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da49fb44bf0c68024126d00226883fbfc75e2b01b86661adb0a73c9b26b527de-userdata-shm.mount: Deactivated successfully. Nov 26 03:11:18 localhost python3[63719]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 26 03:11:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:11:20 localhost podman[63720]: 2025-11-26 08:11:20.800496639 +0000 UTC m=+0.059844814 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, config_id=tripleo_step3, distribution-scope=public) Nov 26 03:11:20 localhost podman[63720]: 2025-11-26 08:11:20.835275836 +0000 UTC m=+0.094624041 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Nov 26 03:11:20 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:11:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:11:21 localhost systemd[1]: tmp-crun.MqIVcI.mount: Deactivated successfully. Nov 26 03:11:21 localhost podman[63738]: 2025-11-26 08:11:21.801130822 +0000 UTC m=+0.066993614 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.component=openstack-iscsid-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:11:21 localhost podman[63738]: 2025-11-26 08:11:21.812875691 +0000 UTC m=+0.078738543 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, 
vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 26 03:11:21 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:11:31 localhost systemd[1]: tmp-crun.rE8HGa.mount: Deactivated successfully. Nov 26 03:11:31 localhost podman[63757]: 2025-11-26 08:11:31.834239066 +0000 UTC m=+0.095557198 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, architecture=x86_64, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:11:32 localhost podman[63757]: 2025-11-26 08:11:32.032082489 +0000 UTC m=+0.293400631 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044) Nov 26 03:11:32 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:11:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:11:52 localhost podman[63865]: 2025-11-26 08:11:52.084573008 +0000 UTC m=+0.346229741 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Nov 26 03:11:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:11:52 localhost podman[63865]: 2025-11-26 08:11:52.123420608 +0000 UTC m=+0.385077241 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 
collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 26 03:11:52 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:11:52 localhost podman[63882]: 2025-11-26 08:11:52.195847047 +0000 UTC m=+0.089387039 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, url=https://www.redhat.com, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z) Nov 26 03:11:52 localhost podman[63882]: 2025-11-26 08:11:52.236418351 +0000 UTC m=+0.129958353 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:11:52 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:11:57 localhost sshd[63904]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:12:02 localhost podman[63906]: 2025-11-26 08:12:02.817750125 +0000 UTC m=+0.079487436 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:12:02 localhost podman[63906]: 2025-11-26 08:12:02.993333646 +0000 UTC m=+0.255070977 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:12:03 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:12:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:12:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:12:22 localhost podman[63936]: 2025-11-26 08:12:22.81817433 +0000 UTC m=+0.080715793 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, container_name=collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, version=17.1.12, architecture=x86_64, release=1761123044, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 26 03:12:22 localhost podman[63936]: 2025-11-26 08:12:22.823994569 +0000 UTC m=+0.086536022 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, architecture=x86_64, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_id=tripleo_step3, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 26 03:12:22 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:12:22 localhost podman[63935]: 2025-11-26 08:12:22.866688637 +0000 UTC m=+0.129830139 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:12:22 localhost podman[63935]: 2025-11-26 08:12:22.87722925 +0000 UTC m=+0.140370762 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, tcib_managed=true) Nov 26 03:12:22 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:12:33 localhost systemd[1]: tmp-crun.hgsVLa.mount: Deactivated successfully. 
Nov 26 03:12:33 localhost podman[63973]: 2025-11-26 08:12:33.823015392 +0000 UTC m=+0.084545791 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, container_name=metrics_qdr, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step1) Nov 26 03:12:34 localhost podman[63973]: 2025-11-26 08:12:34.01321021 +0000 UTC m=+0.274740609 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:12:34 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:12:38 localhost sshd[64078]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:12:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:12:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:12:53 localhost systemd[1]: tmp-crun.KAj1kS.mount: Deactivated successfully. Nov 26 03:12:53 localhost systemd[1]: tmp-crun.77gJec.mount: Deactivated successfully. 
Nov 26 03:12:53 localhost podman[64080]: 2025-11-26 08:12:53.839742986 +0000 UTC m=+0.097894980 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid) Nov 26 03:12:53 localhost podman[64081]: 2025-11-26 08:12:53.813039479 +0000 UTC m=+0.073124712 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container) Nov 26 03:12:53 localhost podman[64081]: 2025-11-26 08:12:53.905308706 +0000 UTC m=+0.165393959 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': 
'512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=) Nov 26 03:12:53 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:12:53 localhost podman[64080]: 2025-11-26 08:12:53.925108973 +0000 UTC m=+0.183261057 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:12:53 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:13:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:13:04 localhost podman[64118]: 2025-11-26 08:13:04.822247995 +0000 UTC m=+0.083292293 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:13:05 localhost podman[64118]: 2025-11-26 08:13:05.039305064 +0000 UTC m=+0.300349362 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, version=17.1.12, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 03:13:05 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:13:19 localhost sshd[64147]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:13:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:13:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:13:24 localhost systemd[1]: tmp-crun.fAzyqw.mount: Deactivated successfully. 
Nov 26 03:13:24 localhost podman[64150]: 2025-11-26 08:13:24.809105021 +0000 UTC m=+0.072198284 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, container_name=collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Nov 26 03:13:24 localhost podman[64149]: 2025-11-26 08:13:24.827765453 +0000 UTC m=+0.090583716 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack 
TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, version=17.1.12, io.openshift.expose-services=) Nov 26 03:13:24 localhost podman[64149]: 2025-11-26 08:13:24.840482172 +0000 UTC m=+0.103300495 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=iscsid, url=https://www.redhat.com) Nov 
26 03:13:24 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:13:24 localhost podman[64150]: 2025-11-26 08:13:24.896054635 +0000 UTC m=+0.159147878 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd) Nov 26 03:13:24 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:13:25 localhost systemd[1]: tmp-crun.Mn3J2H.mount: Deactivated successfully. Nov 26 03:13:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:13:35 localhost podman[64186]: 2025-11-26 08:13:35.79855098 +0000 UTC m=+0.061316780 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:49:46Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com) Nov 26 03:13:35 localhost podman[64186]: 2025-11-26 08:13:35.973383117 +0000 UTC m=+0.236148957 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, container_name=metrics_qdr, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:13:35 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:13:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:13:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:13:55 localhost podman[64291]: 2025-11-26 08:13:55.810171371 +0000 UTC m=+0.073974291 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:13:55 localhost podman[64291]: 2025-11-26 08:13:55.8209186 +0000 UTC m=+0.084721570 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Nov 26 03:13:55 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:13:55 localhost podman[64292]: 2025-11-26 08:13:55.882239894 +0000 UTC m=+0.140033180 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:51:28Z, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:13:55 localhost podman[64292]: 2025-11-26 08:13:55.895100087 +0000 UTC m=+0.152893383 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 26 03:13:55 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:13:58 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:73:ba:36 MACPROTO=0800 SRC=79.124.49.90 DST=38.102.83.176 LEN=40 TOS=0x08 PREC=0x20 TTL=238 ID=51489 PROTO=TCP SPT=43241 DPT=9090 SEQ=3763910931 ACK=0 WINDOW=1024 RES=0x00 SYN URGP=0 Nov 26 03:14:00 localhost sshd[64330]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:14:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:14:06 localhost systemd[1]: tmp-crun.gawsq5.mount: Deactivated successfully. Nov 26 03:14:06 localhost podman[64332]: 2025-11-26 08:14:06.825648917 +0000 UTC m=+0.091859739 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true) Nov 26 03:14:07 localhost podman[64332]: 2025-11-26 08:14:07.065344761 +0000 UTC m=+0.331555613 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=) Nov 26 03:14:07 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:14:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:14:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:14:26 localhost systemd[1]: tmp-crun.K7Qa3S.mount: Deactivated successfully. Nov 26 03:14:26 localhost podman[64362]: 2025-11-26 08:14:26.812735173 +0000 UTC m=+0.073817396 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-type=git, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:14:26 localhost podman[64362]: 2025-11-26 08:14:26.824537025 +0000 UTC m=+0.085619248 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 26 03:14:26 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:14:26 localhost podman[64363]: 2025-11-26 08:14:26.907607452 +0000 UTC m=+0.165613681 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:14:26 localhost podman[64363]: 2025-11-26 08:14:26.921296631 +0000 UTC m=+0.179302860 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, 
container_name=collectd, version=17.1.12, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:14:26 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:14:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:14:37 localhost podman[64399]: 2025-11-26 08:14:37.81606423 +0000 UTC m=+0.077529800 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64) Nov 26 03:14:38 localhost podman[64399]: 2025-11-26 08:14:38.044688947 +0000 UTC m=+0.306154527 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12) Nov 26 03:14:38 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:14:40 localhost sshd[64489]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:14:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:14:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:14:57 localhost podman[64508]: 2025-11-26 08:14:57.814760599 +0000 UTC m=+0.075568660 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, release=1761123044, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, name=rhosp17/openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:14:57 localhost podman[64508]: 2025-11-26 08:14:57.828098307 +0000 UTC m=+0.088906338 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 26 03:14:57 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:14:57 localhost podman[64507]: 2025-11-26 08:14:57.913068963 +0000 UTC m=+0.176917107 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container) Nov 26 03:14:57 localhost podman[64507]: 2025-11-26 08:14:57.953291362 +0000 UTC m=+0.217139486 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid) Nov 26 03:14:57 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:15:04 localhost python3[64592]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:15:05 localhost python3[64637]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144904.616347-107485-59049622244/source _original_basename=tmpghxmdeir follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:06 localhost python3[64699]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:15:07 localhost python3[64742]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144906.297193-107580-55326336990591/source _original_basename=tmpj4gov1nr follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:07 localhost python3[64804]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:15:08 localhost python3[64847]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144907.2974384-107637-33755673637486/source _original_basename=tmprw66p4sv follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:15:08 localhost podman[64848]: 2025-11-26 08:15:08.155773394 +0000 UTC m=+0.075363664 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 
'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, release=1761123044, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=) Nov 26 03:15:08 localhost podman[64848]: 2025-11-26 08:15:08.350438823 +0000 UTC m=+0.270029173 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:15:08 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:15:08 localhost python3[64938]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:15:08 localhost python3[64981]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144908.3141725-107702-277207634748463/source _original_basename=tmpb6e_b7vz follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:09 localhost python3[65011]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 26 03:15:09 localhost systemd[1]: Reloading. Nov 26 03:15:09 localhost systemd-rc-local-generator[65038]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:15:09 localhost systemd-sysv-generator[65042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:15:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:15:10 localhost systemd[1]: Reloading. Nov 26 03:15:10 localhost systemd-sysv-generator[65076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:15:10 localhost systemd-rc-local-generator[65072]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:15:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:15:10 localhost python3[65101]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:15:10 localhost systemd[1]: Reloading. Nov 26 03:15:11 localhost systemd-rc-local-generator[65129]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:15:11 localhost systemd-sysv-generator[65133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:15:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:15:11 localhost systemd[1]: Reloading. Nov 26 03:15:11 localhost systemd-sysv-generator[65167]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:15:11 localhost systemd-rc-local-generator[65163]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:15:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 03:15:11 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m. Nov 26 03:15:11 localhost python3[65192]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 03:15:11 localhost systemd[1]: Reloading. Nov 26 03:15:12 localhost systemd-sysv-generator[65218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:15:12 localhost systemd-rc-local-generator[65214]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:15:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 03:15:12 localhost python3[65276]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:15:13 localhost python3[65319]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144912.4478896-107981-269775933787427/source _original_basename=tmpxnwv_4vb follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:13 localhost python3[65349]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:15:13 localhost systemd[1]: Reloading. Nov 26 03:15:13 localhost systemd-sysv-generator[65376]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:15:13 localhost systemd-rc-local-generator[65372]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:15:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:15:14 localhost systemd[1]: Reached target tripleo_nova_libvirt.target. 
Nov 26 03:15:14 localhost python3[65404]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:15:16 localhost ansible-async_wrapper.py[65576]: Invoked with 961377220690 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144915.5450478-108204-259416211268540/AnsiballZ_command.py _ Nov 26 03:15:16 localhost ansible-async_wrapper.py[65579]: Starting module and watcher Nov 26 03:15:16 localhost ansible-async_wrapper.py[65579]: Start watching 65580 (3600) Nov 26 03:15:16 localhost ansible-async_wrapper.py[65580]: Start module (65580) Nov 26 03:15:16 localhost ansible-async_wrapper.py[65576]: Return async_wrapper task started. Nov 26 03:15:16 localhost python3[65600]: ansible-ansible.legacy.async_status Invoked with jid=961377220690.65576 mode=status _async_dir=/tmp/.ansible_async Nov 26 03:15:19 localhost puppet-user[65599]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 26 03:15:19 localhost puppet-user[65599]: (file: /etc/puppet/hiera.yaml) Nov 26 03:15:19 localhost puppet-user[65599]: Warning: Undefined variable '::deploy_config_name'; Nov 26 03:15:19 localhost puppet-user[65599]: (file & line not available) Nov 26 03:15:19 localhost puppet-user[65599]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 03:15:19 localhost puppet-user[65599]: (file & line not available) Nov 26 03:15:19 localhost puppet-user[65599]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 26 03:15:19 localhost puppet-user[65599]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. 
They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:15:19 localhost puppet-user[65599]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:15:19 localhost puppet-user[65599]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:15:19 localhost puppet-user[65599]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:15:19 localhost puppet-user[65599]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:15:19 localhost puppet-user[65599]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:15:19 localhost puppet-user[65599]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:15:19 localhost puppet-user[65599]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:15:19 localhost puppet-user[65599]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:15:19 localhost puppet-user[65599]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. 
at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:15:19 localhost puppet-user[65599]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:15:19 localhost puppet-user[65599]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:15:19 localhost puppet-user[65599]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:15:19 localhost puppet-user[65599]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:15:19 localhost puppet-user[65599]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:15:19 localhost puppet-user[65599]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:15:19 localhost puppet-user[65599]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:15:20 localhost puppet-user[65599]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 26 03:15:20 localhost puppet-user[65599]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.21 seconds Nov 26 03:15:21 localhost ansible-async_wrapper.py[65579]: 65580 still running (3600) Nov 26 03:15:22 localhost sshd[65718]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:15:26 localhost ansible-async_wrapper.py[65579]: 65580 still running (3595) Nov 26 03:15:26 localhost python3[65802]: ansible-ansible.legacy.async_status Invoked with jid=961377220690.65576 mode=status _async_dir=/tmp/.ansible_async Nov 26 03:15:27 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 26 03:15:27 localhost systemd[1]: Starting man-db-cache-update.service... Nov 26 03:15:27 localhost systemd[1]: Reloading. Nov 26 03:15:27 localhost systemd-rc-local-generator[65877]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:15:27 localhost systemd-sysv-generator[65881]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:15:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:15:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:15:27 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 26 03:15:28 localhost systemd[1]: tmp-crun.CKLZBm.mount: Deactivated successfully. 
Nov 26 03:15:28 localhost podman[66180]: 2025-11-26 08:15:28.117320959 +0000 UTC m=+0.206268844 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:15:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:15:28 localhost podman[66180]: 2025-11-26 08:15:28.132285976 +0000 UTC m=+0.221233861 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:15:28 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:15:28 localhost systemd[1]: tmp-crun.YgjjdU.mount: Deactivated successfully. 
Nov 26 03:15:28 localhost podman[66650]: 2025-11-26 08:15:28.230656342 +0000 UTC m=+0.096200310 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:15:28 localhost podman[66650]: 2025-11-26 08:15:28.241272106 +0000 UTC m=+0.106816054 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_id=tripleo_step3, release=1761123044, build-date=2025-11-18T23:44:13Z) Nov 26 03:15:28 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:15:28 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 26 03:15:28 localhost systemd[1]: Finished man-db-cache-update.service. Nov 26 03:15:28 localhost systemd[1]: run-rb571dced2c0440bdb17be0790a58d95c.service: Deactivated successfully. 
Nov 26 03:15:29 localhost puppet-user[65599]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created Nov 26 03:15:29 localhost puppet-user[65599]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}8a01598c34d14681e8dd0b2ce6a8ecea047f0f546981b9647518366a141cf331' Nov 26 03:15:29 localhost puppet-user[65599]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd' Nov 26 03:15:29 localhost puppet-user[65599]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea' Nov 26 03:15:29 localhost puppet-user[65599]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97' Nov 26 03:15:29 localhost puppet-user[65599]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events Nov 26 03:15:31 localhost ansible-async_wrapper.py[65579]: 65580 still running (3590) Nov 26 03:15:34 localhost puppet-user[65599]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully Nov 26 03:15:34 localhost systemd[1]: Reloading. Nov 26 03:15:34 localhost systemd-sysv-generator[66970]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 03:15:34 localhost systemd-rc-local-generator[66966]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:15:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:15:34 localhost systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Nov 26 03:15:34 localhost snmpd[66980]: Can't find directory of RPM packages
Nov 26 03:15:34 localhost snmpd[66980]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Nov 26 03:15:34 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Nov 26 03:15:34 localhost systemd[1]: Reloading.
Nov 26 03:15:34 localhost systemd-rc-local-generator[67003]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:15:34 localhost systemd-sysv-generator[67007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:15:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:15:35 localhost systemd[1]: Reloading.
Nov 26 03:15:35 localhost systemd-rc-local-generator[67039]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:15:35 localhost systemd-sysv-generator[67043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:15:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:15:35 localhost puppet-user[65599]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Nov 26 03:15:35 localhost puppet-user[65599]: Notice: Applied catalog in 15.46 seconds
Nov 26 03:15:35 localhost puppet-user[65599]: Application:
Nov 26 03:15:35 localhost puppet-user[65599]: Initial environment: production
Nov 26 03:15:35 localhost puppet-user[65599]: Converged environment: production
Nov 26 03:15:35 localhost puppet-user[65599]: Run mode: user
Nov 26 03:15:35 localhost puppet-user[65599]: Changes:
Nov 26 03:15:35 localhost puppet-user[65599]: Total: 8
Nov 26 03:15:35 localhost puppet-user[65599]: Events:
Nov 26 03:15:35 localhost puppet-user[65599]: Success: 8
Nov 26 03:15:35 localhost puppet-user[65599]: Total: 8
Nov 26 03:15:35 localhost puppet-user[65599]: Resources:
Nov 26 03:15:35 localhost puppet-user[65599]: Restarted: 1
Nov 26 03:15:35 localhost puppet-user[65599]: Changed: 8
Nov 26 03:15:35 localhost puppet-user[65599]: Out of sync: 8
Nov 26 03:15:35 localhost puppet-user[65599]: Total: 19
Nov 26 03:15:35 localhost puppet-user[65599]: Time:
Nov 26 03:15:35 localhost puppet-user[65599]: Filebucket: 0.00
Nov 26 03:15:35 localhost puppet-user[65599]: Schedule: 0.00
Nov 26 03:15:35 localhost puppet-user[65599]: Augeas: 0.01
Nov 26 03:15:35 localhost puppet-user[65599]: File: 0.10
Nov 26 03:15:35 localhost puppet-user[65599]: Config retrieval: 0.27
Nov 26 03:15:35 localhost puppet-user[65599]: Service: 1.20
Nov 26 03:15:35 localhost puppet-user[65599]: Transaction evaluation: 15.46
Nov 26 03:15:35 localhost puppet-user[65599]: Catalog application: 15.46
Nov 26 03:15:35 localhost puppet-user[65599]: Last run: 1764144935
Nov 26 03:15:35 localhost puppet-user[65599]: Exec: 5.09
Nov 26 03:15:35 localhost puppet-user[65599]: Package: 8.87
Nov 26 03:15:35 localhost puppet-user[65599]: Total: 15.47
Nov 26 03:15:35 localhost puppet-user[65599]: Version:
Nov 26 03:15:35 localhost puppet-user[65599]: Config: 1764144919
Nov 26 03:15:35 localhost puppet-user[65599]: Puppet: 7.10.0
Nov 26 03:15:35 localhost ansible-async_wrapper.py[65580]: Module complete (65580)
Nov 26 03:15:36 localhost ansible-async_wrapper.py[65579]: Done in kid B.
Nov 26 03:15:37 localhost python3[67068]: ansible-ansible.legacy.async_status Invoked with jid=961377220690.65576 mode=status _async_dir=/tmp/.ansible_async
Nov 26 03:15:37 localhost python3[67084]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 26 03:15:38 localhost python3[67100]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 03:15:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.
Nov 26 03:15:38 localhost podman[67150]: 2025-11-26 08:15:38.630428814 +0000 UTC m=+0.080358766 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64) Nov 26 03:15:38 localhost python3[67151]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:15:38 localhost podman[67150]: 2025-11-26 08:15:38.817470369 +0000 UTC m=+0.267400291 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 26 03:15:38 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:15:38 localhost python3[67195]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp2vp42dek recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 26 03:15:39 localhost python3[67226]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:15:40 localhost python3[67329]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 26 03:15:41 localhost python3[67348]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:15:42 localhost python3[67380]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 03:15:42 localhost python3[67430]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 03:15:43 localhost python3[67448]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:15:43 localhost python3[67510]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 03:15:43 localhost python3[67528]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:15:44 localhost python3[67590]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 03:15:44 localhost python3[67621]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:15:45 localhost python3[67715]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 03:15:45 localhost python3[67750]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:15:45 localhost python3[67810]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 03:15:45 localhost systemd[1]: Reloading.
Nov 26 03:15:45 localhost systemd-sysv-generator[67857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:15:45 localhost systemd-rc-local-generator[67852]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:15:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:15:46 localhost python3[67915]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 03:15:46 localhost python3[67933]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:15:47 localhost python3[67995]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 03:15:47 localhost python3[68013]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:15:48 localhost python3[68043]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 03:15:48 localhost systemd[1]: Reloading.
Nov 26 03:15:48 localhost systemd-sysv-generator[68064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:15:48 localhost systemd-rc-local-generator[68060]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:15:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:15:48 localhost systemd[1]: Starting Create netns directory...
Nov 26 03:15:48 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 26 03:15:48 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 26 03:15:48 localhost systemd[1]: Finished Create netns directory.
Nov 26 03:15:49 localhost python3[68115]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 26 03:15:51 localhost python3[68173]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 26 03:15:51 localhost podman[68314]: 2025-11-26 08:15:51.706421552 +0000 UTC m=+0.078976334 container create 88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=configure_cms_options, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:15:51 localhost systemd[1]: Started libpod-conmon-88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f.scope. Nov 26 03:15:51 localhost systemd[1]: Started libcrun container. 
Nov 26 03:15:51 localhost podman[68344]: 2025-11-26 08:15:51.764345343 +0000 UTC m=+0.106152545 container create 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 26 03:15:51 localhost podman[68314]: 2025-11-26 08:15:51.766315903 +0000 UTC m=+0.138870685 container init 88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:15:51 localhost podman[68314]: 2025-11-26 08:15:51.66967147 +0000 UTC m=+0.042226292 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 26 03:15:51 localhost podman[68356]: 2025-11-26 08:15:51.779550907 +0000 UTC m=+0.104886986 container create a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, tcib_managed=true, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 
'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, container_name=nova_libvirt_init_secret, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., config_id=tripleo_step4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Nov 26 03:15:51 localhost podman[68314]: 2025-11-26 08:15:51.786799739 +0000 UTC m=+0.159354521 container start 88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, container_name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Nov 26 03:15:51 localhost podman[68314]: 2025-11-26 08:15:51.786984804 +0000 UTC m=+0.159539586 container attach 88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=configure_cms_options, maintainer=OpenStack TripleO Team) Nov 26 03:15:51 localhost podman[68366]: 2025-11-26 08:15:51.802991764 +0000 UTC m=+0.111172649 container create 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 03:15:51 localhost podman[68344]: 2025-11-26 08:15:51.716885662 +0000 UTC m=+0.058692864 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 26 03:15:51 localhost podman[68356]: 2025-11-26 08:15:51.719858403 +0000 UTC m=+0.045194482 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 26 03:15:51 localhost systemd[1]: Started libpod-conmon-a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115.scope. Nov 26 03:15:51 localhost systemd[1]: Started libcrun container. Nov 26 03:15:51 localhost podman[68366]: 2025-11-26 08:15:51.735957566 +0000 UTC m=+0.044138461 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Nov 26 03:15:51 localhost systemd[1]: Started libpod-conmon-3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.scope. 
Nov 26 03:15:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06407ef92aa62a84cce5a4c105c3570211ecf8dcded9a8a2a2909fc5cebcb893/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06407ef92aa62a84cce5a4c105c3570211ecf8dcded9a8a2a2909fc5cebcb893/merged/etc/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06407ef92aa62a84cce5a4c105c3570211ecf8dcded9a8a2a2909fc5cebcb893/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:51 localhost systemd[1]: Started libpod-conmon-90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.scope. Nov 26 03:15:51 localhost podman[68356]: 2025-11-26 08:15:51.844526353 +0000 UTC m=+0.169862432 container init a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 
'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=) Nov 26 03:15:51 localhost systemd[1]: Started libcrun container. 
Nov 26 03:15:51 localhost podman[68372]: 2025-11-26 08:15:51.848334329 +0000 UTC m=+0.144843457 container create 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.) Nov 26 03:15:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0518d1e1e8bf83217956125556b84af78ba63cfd3e598ff1993dc680b963ed72/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:51 localhost systemd[1]: Started libcrun container. Nov 26 03:15:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17c292e04cc2973af4faecaf51a38b12d0c20f47d0b5fc279a11e99087cbc694/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:51 localhost systemd[1]: Started libpod-conmon-7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.scope. Nov 26 03:15:51 localhost ovs-vsctl[68434]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options Nov 26 03:15:51 localhost systemd[1]: libpod-88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f.scope: Deactivated successfully. 
Nov 26 03:15:51 localhost podman[68314]: 2025-11-26 08:15:51.893763957 +0000 UTC m=+0.266318759 container died 88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=configure_cms_options, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, 
com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 26 03:15:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:15:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:15:51 localhost podman[68344]: 2025-11-26 08:15:51.89976226 +0000 UTC m=+0.241569452 container init 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12) Nov 26 03:15:51 localhost podman[68366]: 2025-11-26 08:15:51.899986257 +0000 UTC m=+0.208167152 container init 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, 
container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4) Nov 26 03:15:51 localhost systemd[1]: Started libcrun container. Nov 26 03:15:51 localhost podman[68356]: 2025-11-26 08:15:51.906568969 +0000 UTC m=+0.231905078 container start a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 
'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, com.redhat.component=openstack-nova-libvirt-container) Nov 26 03:15:51 localhost podman[68356]: 2025-11-26 08:15:51.906843987 +0000 UTC m=+0.232180086 container attach a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_libvirt_init_secret, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team) Nov 26 03:15:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f99cd177b672ff33074ec35abbc6210e048ba1785e645693f779453f3bd61c4d/merged/var/log/containers supports timestamps 
until 2038 (0x7fffffff) Nov 26 03:15:51 localhost podman[68372]: 2025-11-26 08:15:51.817661492 +0000 UTC m=+0.114170620 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 26 03:15:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:15:51 localhost podman[68366]: 2025-11-26 08:15:51.931025787 +0000 UTC m=+0.239206682 container start 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:15:51 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f94fd18b42545cee37022470afd201a1 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Nov 26 03:15:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:15:51 localhost podman[68344]: 2025-11-26 08:15:51.95178033 +0000 UTC m=+0.293587512 container start 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, distribution-scope=public) Nov 26 03:15:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:15:51 localhost podman[68372]: 2025-11-26 08:15:51.954148333 +0000 UTC m=+0.250657461 container init 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public) Nov 26 03:15:51 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f94fd18b42545cee37022470afd201a1 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 26 03:15:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. 
Nov 26 03:15:51 localhost podman[68372]: 2025-11-26 08:15:51.990553965 +0000 UTC m=+0.287063073 container start 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_id=tripleo_step4) Nov 26 03:15:51 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 26 03:15:52 localhost systemd[1]: libpod-a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115.scope: Deactivated successfully. 
Nov 26 03:15:52 localhost podman[68469]: 2025-11-26 08:15:52.048691072 +0000 UTC m=+0.090888049 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Nov 26 03:15:52 localhost podman[68440]: 2025-11-26 08:15:52.078851333 +0000 UTC m=+0.172426219 container cleanup 88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, container_name=configure_cms_options, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:15:52 localhost systemd[1]: libpod-conmon-88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f.scope: Deactivated successfully. 
Nov 26 03:15:52 localhost podman[68460]: 2025-11-26 08:15:52.084105004 +0000 UTC m=+0.145439016 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:15:52 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764143208 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi Nov 26 03:15:52 localhost podman[68469]: 2025-11-26 08:15:52.181249372 +0000 UTC m=+0.223446369 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:15:52 localhost podman[68356]: 2025-11-26 08:15:52.2030797 +0000 UTC m=+0.528415799 container died a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_libvirt_init_secret, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 
'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, release=1761123044, vendor=Red Hat, Inc.) 
Nov 26 03:15:52 localhost podman[68460]: 2025-11-26 08:15:52.221859673 +0000 UTC m=+0.283193735 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:15:52 localhost podman[68460]: unhealthy Nov 26 03:15:52 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:15:52 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Failed with result 'exit-code'. Nov 26 03:15:52 localhost podman[68469]: unhealthy Nov 26 03:15:52 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:15:52 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Failed with result 'exit-code'. 
Nov 26 03:15:52 localhost podman[68505]: 2025-11-26 08:15:52.068773916 +0000 UTC m=+0.076242541 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:15:52 localhost podman[68522]: 2025-11-26 08:15:52.279033091 +0000 UTC m=+0.266875546 container cleanup a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, container_name=nova_libvirt_init_secret, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:15:52 localhost systemd[1]: libpod-conmon-a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115.scope: Deactivated successfully. 
Nov 26 03:15:52 localhost podman[68505]: 2025-11-26 08:15:52.302225289 +0000 UTC m=+0.309693924 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Nov 26 03:15:52 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:15:52 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack Nov 26 03:15:52 localhost podman[68698]: 2025-11-26 08:15:52.466874441 +0000 UTC m=+0.069458514 container create 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Nov 26 03:15:52 localhost systemd[1]: Started libpod-conmon-9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.scope. Nov 26 03:15:52 localhost systemd[1]: Started libcrun container. Nov 26 03:15:52 localhost podman[68698]: 2025-11-26 08:15:52.427027223 +0000 UTC m=+0.029611336 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 26 03:15:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ad32a5db29098f5568060ccdb89afe68c9fb2dd318793af5aa95785da54e96e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:52 localhost podman[68740]: 2025-11-26 08:15:52.555581052 +0000 UTC m=+0.071737944 container create ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=setup_ovs_manager, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4) Nov 26 03:15:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:15:52 localhost podman[68698]: 2025-11-26 08:15:52.572501669 +0000 UTC m=+0.175085732 container init 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:15:52 localhost systemd[1]: Started libpod-conmon-ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836.scope. Nov 26 03:15:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:15:52 localhost podman[68698]: 2025-11-26 08:15:52.610362815 +0000 UTC m=+0.212946888 container start 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, release=1761123044) Nov 26 03:15:52 localhost systemd[1]: Started libcrun container. 
Nov 26 03:15:52 localhost podman[68740]: 2025-11-26 08:15:52.516314221 +0000 UTC m=+0.032471153 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 26 03:15:52 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 26 03:15:52 localhost podman[68740]: 2025-11-26 08:15:52.656091382 +0000 UTC m=+0.172248284 container init ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:15:52 localhost podman[68740]: 2025-11-26 08:15:52.680577721 +0000 UTC m=+0.196734613 container start ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=setup_ovs_manager, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:15:52 localhost podman[68740]: 2025-11-26 08:15:52.680843779 +0000 UTC m=+0.197000711 container attach ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:15:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6f1ec72c694ee3d21b53bc847b40f0b2933ed88356e3442959458e7c8f9a115-userdata-shm.mount: Deactivated successfully. Nov 26 03:15:52 localhost systemd[1]: var-lib-containers-storage-overlay-64dc032106e0155914798cde9f31ffb5c79bf4498cfa055c6994544bc631b545-merged.mount: Deactivated successfully. Nov 26 03:15:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-88859464a8dcc625f3a0cd00ac8601826e60978192a1e2d12d34e16762adcf0f-userdata-shm.mount: Deactivated successfully. 
Nov 26 03:15:52 localhost podman[68764]: 2025-11-26 08:15:52.74209532 +0000 UTC m=+0.124082952 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, release=1761123044, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-type=git) Nov 26 03:15:53 localhost podman[68764]: 2025-11-26 08:15:53.081482521 +0000 UTC m=+0.463470193 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true) Nov 26 03:15:53 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:15:53 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Nov 26 03:15:55 localhost ovs-vsctl[68942]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager Nov 26 03:15:55 localhost systemd[1]: libpod-ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836.scope: Deactivated successfully. 
Nov 26 03:15:55 localhost systemd[1]: libpod-ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836.scope: Consumed 2.975s CPU time. Nov 26 03:15:55 localhost podman[68943]: 2025-11-26 08:15:55.75345879 +0000 UTC m=+0.059481318 container died ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, 
build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=setup_ovs_manager, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Nov 26 03:15:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836-userdata-shm.mount: Deactivated successfully. Nov 26 03:15:55 localhost systemd[1]: var-lib-containers-storage-overlay-23a3228a9cce0b0415c49ae8c807f0239a9343b6fe6519fde67c779fd8bde488-merged.mount: Deactivated successfully. 
Nov 26 03:15:55 localhost podman[68943]: 2025-11-26 08:15:55.798412254 +0000 UTC m=+0.104434762 container cleanup ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=setup_ovs_manager, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:15:55 localhost systemd[1]: libpod-conmon-ee90a703d474495d57becf520e5f387ffdd300ed7d3603f704ae4f3d84996836.scope: Deactivated successfully. Nov 26 03:15:55 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764143208 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764143208'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata Nov 26 03:15:56 localhost podman[69053]: 2025-11-26 08:15:56.152599437 +0000 UTC m=+0.054129325 container create 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64) Nov 26 03:15:56 localhost systemd[1]: Started libpod-conmon-670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.scope. Nov 26 03:15:56 localhost systemd[1]: Started libcrun container. Nov 26 03:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d5b4c6945eb62c6a5f0770a2d5f587e2cc4a0e0e03face3bef429dfc2877809/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d5b4c6945eb62c6a5f0770a2d5f587e2cc4a0e0e03face3bef429dfc2877809/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d5b4c6945eb62c6a5f0770a2d5f587e2cc4a0e0e03face3bef429dfc2877809/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:56 localhost podman[69069]: 2025-11-26 08:15:56.225737822 +0000 UTC m=+0.092739685 container create 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, tcib_managed=true, distribution-scope=public) Nov 26 03:15:56 localhost podman[69053]: 2025-11-26 08:15:56.127609194 +0000 UTC m=+0.029139092 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 26 03:15:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:15:56 localhost podman[69053]: 2025-11-26 08:15:56.236684527 +0000 UTC m=+0.138214415 container init 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:15:56 localhost systemd[1]: Started libpod-conmon-4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.scope. Nov 26 03:15:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:15:56 localhost systemd[1]: Started libcrun container. 
Nov 26 03:15:56 localhost podman[69053]: 2025-11-26 08:15:56.278418981 +0000 UTC m=+0.179948889 container start 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bfd9627f34b55559ce4f57af46bbdd5ae4252c5961ae723ed839f5c0358e239/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bfd9627f34b55559ce4f57af46bbdd5ae4252c5961ae723ed839f5c0358e239/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bfd9627f34b55559ce4f57af46bbdd5ae4252c5961ae723ed839f5c0358e239/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff) Nov 26 03:15:56 localhost podman[69069]: 2025-11-26 08:15:56.185863003 +0000 UTC m=+0.052864916 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 26 03:15:56 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name 
ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=8346a4a86ac2c2b1d52b2e36f598d419 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 26 03:15:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:15:56 localhost podman[69069]: 2025-11-26 08:15:56.307909793 +0000 UTC m=+0.174911676 container init 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller) Nov 26 03:15:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:15:56 localhost podman[69069]: 2025-11-26 08:15:56.333242327 +0000 UTC m=+0.200244170 container start 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller) Nov 26 03:15:56 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:15:56 localhost python3[68173]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume 
/var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 26 03:15:56 localhost systemd[1]: Created slice User Slice of UID 0. Nov 26 03:15:56 localhost podman[69099]: 2025-11-26 08:15:56.344834601 +0000 UTC m=+0.067431201 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:15:56 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 26 03:15:56 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 26 03:15:56 localhost systemd[1]: Starting User Manager for UID 0... 
Nov 26 03:15:56 localhost podman[69099]: 2025-11-26 08:15:56.370177845 +0000 UTC m=+0.092774465 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:15:56 localhost podman[69099]: unhealthy Nov 26 03:15:56 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:15:56 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:15:56 localhost podman[69123]: 2025-11-26 08:15:56.457115772 +0000 UTC m=+0.115880742 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) Nov 26 03:15:56 localhost podman[69123]: 2025-11-26 08:15:56.471167501 +0000 UTC m=+0.129932471 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, distribution-scope=public, 
com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com) Nov 26 03:15:56 localhost podman[69123]: unhealthy Nov 26 03:15:56 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:15:56 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:15:56 localhost systemd[69149]: Queued start job for default target Main User Target. Nov 26 03:15:56 localhost systemd[69149]: Created slice User Application Slice. Nov 26 03:15:56 localhost systemd[69149]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 26 03:15:56 localhost systemd[69149]: Started Daily Cleanup of User's Temporary Directories. Nov 26 03:15:56 localhost systemd[69149]: Reached target Paths. Nov 26 03:15:56 localhost systemd[69149]: Reached target Timers. Nov 26 03:15:56 localhost systemd[69149]: Starting D-Bus User Message Bus Socket... Nov 26 03:15:56 localhost systemd[69149]: Starting Create User's Volatile Files and Directories... Nov 26 03:15:56 localhost systemd[69149]: Listening on D-Bus User Message Bus Socket. Nov 26 03:15:56 localhost systemd[69149]: Finished Create User's Volatile Files and Directories. Nov 26 03:15:56 localhost systemd[69149]: Reached target Sockets. Nov 26 03:15:56 localhost systemd[69149]: Reached target Basic System. Nov 26 03:15:56 localhost systemd[69149]: Reached target Main User Target. Nov 26 03:15:56 localhost systemd[69149]: Startup finished in 150ms. 
Nov 26 03:15:56 localhost systemd[1]: Started User Manager for UID 0. Nov 26 03:15:56 localhost systemd[1]: Started Session c9 of User root. Nov 26 03:15:56 localhost systemd[1]: session-c9.scope: Deactivated successfully. Nov 26 03:15:56 localhost kernel: device br-int entered promiscuous mode Nov 26 03:15:56 localhost NetworkManager[5970]: [1764144956.6512] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Nov 26 03:15:56 localhost systemd-udevd[69216]: Network interface NamePolicy= disabled on kernel command line. Nov 26 03:15:56 localhost NetworkManager[5970]: [1764144956.6967] device (genev_sys_6081): carrier: link connected Nov 26 03:15:56 localhost systemd-udevd[69219]: Network interface NamePolicy= disabled on kernel command line. Nov 26 03:15:56 localhost kernel: device genev_sys_6081 entered promiscuous mode Nov 26 03:15:56 localhost NetworkManager[5970]: [1764144956.6970] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12) Nov 26 03:15:56 localhost python3[69236]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:57 localhost python3[69254]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:57 localhost python3[69270]: ansible-file 
Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:57 localhost python3[69286]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:57 localhost python3[69302]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:58 localhost python3[69321]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:15:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. 
Nov 26 03:15:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:15:58 localhost podman[69340]: 2025-11-26 08:15:58.474841998 +0000 UTC m=+0.097810359 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team) Nov 26 03:15:58 localhost systemd[1]: tmp-crun.IP5WBh.mount: Deactivated successfully. 
Nov 26 03:15:58 localhost podman[69338]: 2025-11-26 08:15:58.518118161 +0000 UTC m=+0.141207125 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Nov 26 03:15:58 localhost python3[69339]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:15:58 localhost podman[69340]: 2025-11-26 08:15:58.560500226 +0000 UTC m=+0.183468607 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 26 03:15:58 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:15:58 localhost podman[69338]: 2025-11-26 08:15:58.580798186 +0000 UTC m=+0.203887130 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 26 03:15:58 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:15:58 localhost python3[69393]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:15:59 localhost python3[69410]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:15:59 localhost python3[69428]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:15:59 localhost python3[69444]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:15:59 localhost python3[69460]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:16:00 localhost python3[69521]: ansible-copy Invoked with 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144959.904925-109479-95254703275528/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:16:00 localhost python3[69550]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144959.904925-109479-95254703275528/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:16:01 localhost python3[69579]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144959.904925-109479-95254703275528/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:16:01 localhost python3[69608]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144959.904925-109479-95254703275528/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None 
Nov 26 03:16:02 localhost python3[69637]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144959.904925-109479-95254703275528/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:16:02 localhost python3[69666]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764144959.904925-109479-95254703275528/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:16:03 localhost python3[69682]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 26 03:16:03 localhost systemd[1]: Reloading.
Nov 26 03:16:03 localhost systemd-rc-local-generator[69703]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:03 localhost systemd-sysv-generator[69708]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:16:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:04 localhost python3[69734]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 03:16:04 localhost systemd[1]: Reloading.
Nov 26 03:16:04 localhost systemd-rc-local-generator[69763]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:04 localhost systemd-sysv-generator[69767]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:16:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:04 localhost systemd[1]: Starting ceilometer_agent_compute container...
Nov 26 03:16:04 localhost tripleo-start-podman-container[69775]: Creating additional drop-in dependency for "ceilometer_agent_compute" (3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe)
Nov 26 03:16:04 localhost systemd[1]: Reloading.
Nov 26 03:16:04 localhost systemd-rc-local-generator[69831]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:04 localhost systemd-sysv-generator[69838]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:16:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:05 localhost systemd[1]: Started ceilometer_agent_compute container.
Nov 26 03:16:05 localhost sshd[69845]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 03:16:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 26 03:16:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 5231 writes, 23K keys, 5231 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5231 writes, 596 syncs, 8.78 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 64 writes, 108 keys, 64 commit groups, 1.0 writes per commit group, ingest: 0.03 MB, 0.00 MB/s#012Interval WAL: 64 writes, 32 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 26 03:16:05 localhost python3[69862]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 03:16:05 localhost systemd[1]: Reloading.
Nov 26 03:16:06 localhost systemd-rc-local-generator[69887]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:06 localhost systemd-sysv-generator[69893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:16:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:06 localhost systemd[1]: Starting ceilometer_agent_ipmi container...
Nov 26 03:16:06 localhost systemd[1]: Started ceilometer_agent_ipmi container.
Nov 26 03:16:06 localhost systemd[1]: Stopping User Manager for UID 0...
Nov 26 03:16:06 localhost systemd[69149]: Activating special unit Exit the Session...
Nov 26 03:16:06 localhost systemd[69149]: Stopped target Main User Target.
Nov 26 03:16:06 localhost systemd[69149]: Stopped target Basic System.
Nov 26 03:16:06 localhost systemd[69149]: Stopped target Paths.
Nov 26 03:16:06 localhost systemd[69149]: Stopped target Sockets.
Nov 26 03:16:06 localhost systemd[69149]: Stopped target Timers.
Nov 26 03:16:06 localhost systemd[69149]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 26 03:16:06 localhost systemd[69149]: Closed D-Bus User Message Bus Socket.
Nov 26 03:16:06 localhost systemd[69149]: Stopped Create User's Volatile Files and Directories.
Nov 26 03:16:06 localhost systemd[69149]: Removed slice User Application Slice.
Nov 26 03:16:06 localhost systemd[69149]: Reached target Shutdown.
Nov 26 03:16:06 localhost systemd[69149]: Finished Exit the Session.
Nov 26 03:16:06 localhost systemd[69149]: Reached target Exit the Session.
Nov 26 03:16:06 localhost systemd[1]: user@0.service: Deactivated successfully.
Nov 26 03:16:06 localhost systemd[1]: Stopped User Manager for UID 0.
Nov 26 03:16:06 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 26 03:16:06 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 26 03:16:06 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 26 03:16:06 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 26 03:16:06 localhost systemd[1]: Removed slice User Slice of UID 0.
Nov 26 03:16:07 localhost python3[69929]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 03:16:07 localhost systemd[1]: Reloading.
Nov 26 03:16:07 localhost systemd-rc-local-generator[69955]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:07 localhost systemd-sysv-generator[69960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:16:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:07 localhost systemd[1]: Starting logrotate_crond container...
Nov 26 03:16:07 localhost systemd[1]: Started logrotate_crond container.
Nov 26 03:16:08 localhost python3[69994]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 03:16:08 localhost systemd[1]: Reloading.
Nov 26 03:16:08 localhost systemd-rc-local-generator[70019]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:08 localhost systemd-sysv-generator[70024]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:16:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:08 localhost systemd[1]: Starting nova_migration_target container...
Nov 26 03:16:08 localhost systemd[1]: Started nova_migration_target container.
Nov 26 03:16:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.
Nov 26 03:16:09 localhost systemd[1]: tmp-crun.zeVwiN.mount: Deactivated successfully.
Nov 26 03:16:09 localhost podman[70061]: 2025-11-26 08:16:09.24902954 +0000 UTC m=+0.091414154 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, tcib_managed=true, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:16:09 localhost python3[70062]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:16:09 localhost podman[70061]: 2025-11-26 08:16:09.460562985 +0000 UTC m=+0.302947619 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:16:09 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:16:09 localhost systemd[1]: Reloading. Nov 26 03:16:09 localhost systemd-sysv-generator[70119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 03:16:09 localhost systemd-rc-local-generator[70115]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 26 03:16:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 4356 writes, 19K keys, 4356 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4356 writes, 436 syncs, 9.99 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 46 writes, 88 keys, 46 commit groups, 1.0 writes per commit group, ingest: 0.02 MB, 0.00 MB/s#012Interval WAL: 46 writes, 23 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 26 03:16:09 localhost systemd[1]: Starting ovn_controller container...
Nov 26 03:16:10 localhost tripleo-start-podman-container[70131]: Creating additional drop-in dependency for "ovn_controller" (4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5)
Nov 26 03:16:10 localhost systemd[1]: Reloading.
Nov 26 03:16:10 localhost systemd-sysv-generator[70188]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:16:10 localhost systemd-rc-local-generator[70185]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:10 localhost systemd[1]: Started ovn_controller container.
Nov 26 03:16:11 localhost python3[70214]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 03:16:11 localhost systemd[1]: Reloading.
Nov 26 03:16:11 localhost systemd-sysv-generator[70244]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 03:16:11 localhost systemd-rc-local-generator[70239]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 03:16:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 03:16:11 localhost systemd[1]: Starting ovn_metadata_agent container...
Nov 26 03:16:11 localhost systemd[1]: Started ovn_metadata_agent container.
Nov 26 03:16:11 localhost python3[70295]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:16:13 localhost python3[70416]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005536118 step=4 update_config_hash_only=False
Nov 26 03:16:13 localhost python3[70432]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 03:16:14 localhost python3[70448]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 26 03:16:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.
Nov 26 03:16:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.
Nov 26 03:16:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.
Nov 26 03:16:22 localhost podman[70452]: 2025-11-26 08:16:22.823705659 +0000 UTC m=+0.085524664 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) Nov 26 03:16:22 localhost podman[70452]: 2025-11-26 08:16:22.838337875 +0000 UTC m=+0.100156970 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond) Nov 26 03:16:22 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:16:22 localhost podman[70451]: 2025-11-26 08:16:22.923437977 +0000 UTC m=+0.185278823 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute) Nov 26 03:16:22 localhost podman[70451]: 2025-11-26 08:16:22.983146671 +0000 UTC m=+0.244987457 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12) Nov 26 03:16:22 localhost systemd[1]: tmp-crun.4LrBvw.mount: Deactivated successfully. Nov 26 03:16:22 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:16:23 localhost podman[70453]: 2025-11-26 08:16:22.988184845 +0000 UTC m=+0.247103882 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:16:23 localhost podman[70453]: 2025-11-26 08:16:23.068775018 +0000 UTC m=+0.327694035 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-type=git, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, container_name=ceilometer_agent_ipmi) Nov 26 03:16:23 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:16:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:16:23 localhost podman[70520]: 2025-11-26 08:16:23.82020663 +0000 UTC m=+0.082980928 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible) Nov 26 03:16:24 localhost podman[70520]: 2025-11-26 08:16:24.157802385 +0000 UTC m=+0.420576683 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Nov 26 03:16:24 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:16:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:16:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:16:26 localhost podman[70543]: 2025-11-26 08:16:26.818482939 +0000 UTC m=+0.080374208 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 26 03:16:26 localhost podman[70544]: 2025-11-26 08:16:26.872383145 +0000 UTC m=+0.132096057 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, architecture=x86_64, container_name=ovn_metadata_agent, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible) Nov 26 03:16:26 localhost podman[70543]: 2025-11-26 08:16:26.89607681 +0000 UTC m=+0.157968329 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ovn-controller) Nov 26 03:16:26 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:16:26 localhost podman[70544]: 2025-11-26 08:16:26.925374005 +0000 UTC m=+0.185086927 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc.) Nov 26 03:16:26 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:16:28 localhost systemd[1]: tmp-crun.UkATWr.mount: Deactivated successfully. 
Nov 26 03:16:28 localhost podman[70590]: 2025-11-26 08:16:28.820762763 +0000 UTC m=+0.084194714 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 26 03:16:28 localhost podman[70590]: 2025-11-26 08:16:28.831242064 +0000 UTC m=+0.094674005 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, config_id=tripleo_step3, distribution-scope=public) Nov 26 03:16:28 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:16:28 localhost podman[70589]: 2025-11-26 08:16:28.917008424 +0000 UTC m=+0.182813747 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-type=git, release=1761123044, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1) Nov 26 03:16:28 localhost podman[70589]: 2025-11-26 08:16:28.924301447 +0000 UTC m=+0.190106780 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, io.buildah.version=1.41.4, version=17.1.12) Nov 26 03:16:28 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:16:34 localhost snmpd[66980]: empty variable list in _query Nov 26 03:16:34 localhost snmpd[66980]: empty variable list in _query Nov 26 03:16:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:16:39 localhost systemd[1]: tmp-crun.1k22al.mount: Deactivated successfully. 
Nov 26 03:16:39 localhost podman[70629]: 2025-11-26 08:16:39.840492297 +0000 UTC m=+0.092210989 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step1, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:16:40 localhost podman[70629]: 2025-11-26 08:16:40.031441502 +0000 UTC m=+0.283160134 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, vcs-type=git, config_id=tripleo_step1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, release=1761123044) Nov 26 03:16:40 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:16:47 localhost sshd[70659]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:16:51 localhost podman[70843]: Nov 26 03:16:51 localhost podman[70843]: 2025-11-26 08:16:51.446002153 +0000 UTC m=+0.076288132 container create d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_hellman, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 03:16:51 localhost podman[70843]: 2025-11-26 08:16:51.408320762 +0000 UTC m=+0.038606761 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 03:16:51 localhost systemd[1]: Started libpod-conmon-d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed.scope. Nov 26 03:16:51 localhost systemd[1]: Started libcrun container. 
Nov 26 03:16:51 localhost podman[70843]: 2025-11-26 08:16:51.78460937 +0000 UTC m=+0.414895349 container init d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_hellman, release=553, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 03:16:51 localhost podman[70843]: 2025-11-26 08:16:51.796429091 +0000 UTC m=+0.426715060 container start d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_hellman, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, name=rhceph, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.33.12, 
GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git) Nov 26 03:16:51 localhost youthful_hellman[70856]: 167 167 Nov 26 03:16:51 localhost systemd[1]: libpod-d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed.scope: Deactivated successfully. Nov 26 03:16:51 localhost podman[70843]: 2025-11-26 08:16:51.797184515 +0000 UTC m=+0.427470484 container attach d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_hellman, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, RELEASE=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph 
ceph, maintainer=Guillaume Abrioux ) Nov 26 03:16:51 localhost podman[70843]: 2025-11-26 08:16:51.805454198 +0000 UTC m=+0.435740197 container died d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_hellman, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, ceph=True, RELEASE=main, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7) Nov 26 03:16:51 localhost podman[70863]: 2025-11-26 08:16:51.902718359 +0000 UTC m=+0.090381102 container remove d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_hellman, ceph=True, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public) Nov 26 03:16:51 localhost systemd[1]: libpod-conmon-d83d9ddf0c44121b099b4bebef0abc974d6e8940a4d7ec1da430dc7f3d6918ed.scope: Deactivated successfully. Nov 26 03:16:52 localhost podman[70885]: Nov 26 03:16:52 localhost podman[70885]: 2025-11-26 08:16:52.145422876 +0000 UTC m=+0.086008360 container create 1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_swartz, version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.component=rhceph-container) Nov 26 03:16:52 localhost systemd[1]: Started libpod-conmon-1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f.scope. Nov 26 03:16:52 localhost systemd[1]: Started libcrun container. Nov 26 03:16:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9451d616b1c935968e098829c92c6358f0115b385c8f315c8f9c3670087242c/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 03:16:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9451d616b1c935968e098829c92c6358f0115b385c8f315c8f9c3670087242c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 03:16:52 localhost podman[70885]: 2025-11-26 08:16:52.107559029 +0000 UTC m=+0.048144483 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 03:16:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9451d616b1c935968e098829c92c6358f0115b385c8f315c8f9c3670087242c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 03:16:52 localhost podman[70885]: 2025-11-26 08:16:52.211212546 +0000 UTC m=+0.151797990 container init 1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_swartz, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., 
com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 03:16:52 localhost podman[70885]: 2025-11-26 08:16:52.22114578 +0000 UTC m=+0.161731184 container start 1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_swartz, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553) Nov 26 03:16:52 localhost podman[70885]: 2025-11-26 08:16:52.221433959 +0000 UTC m=+0.162019393 container attach 1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_swartz, ceph=True, summary=Provides the latest Red Hat 
Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, vcs-type=git, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc.) Nov 26 03:16:52 localhost systemd[1]: var-lib-containers-storage-overlay-ba455e0db28e22f57bbd0c23fef70903afdf07c37995a9a90e249d287ac49f74-merged.mount: Deactivated successfully. 
Nov 26 03:16:53 localhost serene_swartz[70900]: [ Nov 26 03:16:53 localhost serene_swartz[70900]: { Nov 26 03:16:53 localhost serene_swartz[70900]: "available": false, Nov 26 03:16:53 localhost serene_swartz[70900]: "ceph_device": false, Nov 26 03:16:53 localhost serene_swartz[70900]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 26 03:16:53 localhost serene_swartz[70900]: "lsm_data": {}, Nov 26 03:16:53 localhost serene_swartz[70900]: "lvs": [], Nov 26 03:16:53 localhost serene_swartz[70900]: "path": "/dev/sr0", Nov 26 03:16:53 localhost serene_swartz[70900]: "rejected_reasons": [ Nov 26 03:16:53 localhost serene_swartz[70900]: "Has a FileSystem", Nov 26 03:16:53 localhost serene_swartz[70900]: "Insufficient space (<5GB)" Nov 26 03:16:53 localhost serene_swartz[70900]: ], Nov 26 03:16:53 localhost serene_swartz[70900]: "sys_api": { Nov 26 03:16:53 localhost serene_swartz[70900]: "actuators": null, Nov 26 03:16:53 localhost serene_swartz[70900]: "device_nodes": "sr0", Nov 26 03:16:53 localhost serene_swartz[70900]: "human_readable_size": "482.00 KB", Nov 26 03:16:53 localhost serene_swartz[70900]: "id_bus": "ata", Nov 26 03:16:53 localhost serene_swartz[70900]: "model": "QEMU DVD-ROM", Nov 26 03:16:53 localhost serene_swartz[70900]: "nr_requests": "2", Nov 26 03:16:53 localhost serene_swartz[70900]: "partitions": {}, Nov 26 03:16:53 localhost serene_swartz[70900]: "path": "/dev/sr0", Nov 26 03:16:53 localhost serene_swartz[70900]: "removable": "1", Nov 26 03:16:53 localhost serene_swartz[70900]: "rev": "2.5+", Nov 26 03:16:53 localhost serene_swartz[70900]: "ro": "0", Nov 26 03:16:53 localhost serene_swartz[70900]: "rotational": "1", Nov 26 03:16:53 localhost serene_swartz[70900]: "sas_address": "", Nov 26 03:16:53 localhost serene_swartz[70900]: "sas_device_handle": "", Nov 26 03:16:53 localhost serene_swartz[70900]: "scheduler_mode": "mq-deadline", Nov 26 03:16:53 localhost serene_swartz[70900]: "sectors": 0, Nov 26 03:16:53 localhost serene_swartz[70900]: 
"sectorsize": "2048", Nov 26 03:16:53 localhost serene_swartz[70900]: "size": 493568.0, Nov 26 03:16:53 localhost serene_swartz[70900]: "support_discard": "0", Nov 26 03:16:53 localhost serene_swartz[70900]: "type": "disk", Nov 26 03:16:53 localhost serene_swartz[70900]: "vendor": "QEMU" Nov 26 03:16:53 localhost serene_swartz[70900]: } Nov 26 03:16:53 localhost serene_swartz[70900]: } Nov 26 03:16:53 localhost serene_swartz[70900]: ] Nov 26 03:16:53 localhost systemd[1]: libpod-1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f.scope: Deactivated successfully. Nov 26 03:16:53 localhost systemd[1]: libpod-1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f.scope: Consumed 1.093s CPU time. Nov 26 03:16:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:16:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:16:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:16:53 localhost podman[72917]: 2025-11-26 08:16:53.370036397 +0000 UTC m=+0.049661568 container died 1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_swartz, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 03:16:53 localhost systemd[1]: tmp-crun.KtRZyo.mount: Deactivated successfully. Nov 26 03:16:53 localhost systemd[1]: var-lib-containers-storage-overlay-f9451d616b1c935968e098829c92c6358f0115b385c8f315c8f9c3670087242c-merged.mount: Deactivated successfully. 
Nov 26 03:16:53 localhost podman[72917]: 2025-11-26 08:16:53.400799907 +0000 UTC m=+0.080425058 container remove 1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_swartz, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, release=553, ceph=True, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 03:16:53 localhost systemd[1]: libpod-conmon-1c421ec9de9364c5cfa63ddd322a3c15a480e0f10463d715fa6ff0a58c9a356f.scope: Deactivated successfully. 
Nov 26 03:16:53 localhost podman[72924]: 2025-11-26 08:16:53.464868965 +0000 UTC m=+0.130601462 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, container_name=logrotate_crond, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:16:53 localhost podman[72924]: 2025-11-26 08:16:53.477456659 +0000 UTC m=+0.143188977 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Nov 26 03:16:53 localhost podman[72923]: 2025-11-26 08:16:53.437367185 +0000 UTC m=+0.107497856 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:16:53 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:16:53 localhost podman[72925]: 2025-11-26 08:16:53.538039441 +0000 UTC m=+0.203780868 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=) Nov 26 03:16:53 localhost podman[72923]: 2025-11-26 08:16:53.570567455 +0000 UTC m=+0.240698166 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:16:53 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:16:53 localhost podman[72925]: 2025-11-26 08:16:53.595211069 +0000 UTC m=+0.260952516 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 26 03:16:53 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:16:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:16:54 localhost podman[73015]: 2025-11-26 08:16:54.834514648 +0000 UTC m=+0.093822238 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com) Nov 26 03:16:55 localhost podman[73015]: 2025-11-26 08:16:55.200349697 +0000 UTC m=+0.459657277 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, io.openshift.expose-services=, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:16:55 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:16:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:16:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:16:57 localhost podman[73038]: 2025-11-26 08:16:57.829315372 +0000 UTC m=+0.084876774 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:16:57 localhost podman[73038]: 2025-11-26 08:16:57.882015843 +0000 UTC m=+0.137577195 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:16:57 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:16:57 localhost podman[73039]: 2025-11-26 08:16:57.883330082 +0000 UTC m=+0.136537783 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 03:16:57 localhost podman[73039]: 2025-11-26 08:16:57.966297418 +0000 UTC m=+0.219505149 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64) Nov 26 03:16:57 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:16:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:16:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:16:59 localhost podman[73087]: 2025-11-26 08:16:59.825082747 +0000 UTC m=+0.085254605 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=collectd, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, release=1761123044, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=) Nov 26 03:16:59 localhost podman[73086]: 2025-11-26 08:16:59.868163154 +0000 UTC m=+0.131887491 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid) Nov 26 03:16:59 localhost podman[73087]: 2025-11-26 
08:16:59.892705545 +0000 UTC m=+0.152877363 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3) Nov 26 03:16:59 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:16:59 localhost podman[73086]: 2025-11-26 08:16:59.908435805 +0000 UTC m=+0.172160152 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:16:59 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:17:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:17:10 localhost podman[73124]: 2025-11-26 08:17:10.812099273 +0000 UTC m=+0.075938221 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, 
version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:17:11 localhost podman[73124]: 2025-11-26 08:17:11.007319629 +0000 UTC m=+0.271158607 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:17:11 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:17:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:17:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:17:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:17:23 localhost podman[73156]: 2025-11-26 08:17:23.839010195 +0000 UTC m=+0.092246809 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 03:17:23 localhost systemd[1]: tmp-crun.fiyFbC.mount: Deactivated successfully. Nov 26 03:17:23 localhost podman[73156]: 2025-11-26 08:17:23.897480772 +0000 UTC m=+0.150717416 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:17:23 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:17:23 localhost podman[73158]: 2025-11-26 08:17:23.899017459 +0000 UTC m=+0.148620162 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi) Nov 26 03:17:23 localhost podman[73158]: 2025-11-26 08:17:23.978849628 +0000 UTC m=+0.228452321 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Nov 26 03:17:23 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:17:24 localhost podman[73157]: 2025-11-26 08:17:24.043514594 +0000 UTC m=+0.294097937 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 26 03:17:24 localhost podman[73157]: 2025-11-26 08:17:24.078888206 +0000 UTC m=+0.329471489 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:17:24 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:17:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:17:25 localhost systemd[1]: tmp-crun.4YJXRj.mount: Deactivated successfully. 
Nov 26 03:17:25 localhost podman[73231]: 2025-11-26 08:17:25.830102068 +0000 UTC m=+0.090736113 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64) Nov 26 03:17:26 localhost podman[73231]: 2025-11-26 08:17:26.21925929 +0000 UTC m=+0.479893265 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, release=1761123044, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:17:26 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:17:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:17:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:17:28 localhost systemd[1]: tmp-crun.BGeBkU.mount: Deactivated successfully. 
Nov 26 03:17:28 localhost podman[73254]: 2025-11-26 08:17:28.825003465 +0000 UTC m=+0.089392893 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:17:28 localhost podman[73255]: 2025-11-26 08:17:28.861524781 +0000 UTC m=+0.123251857 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4) Nov 26 03:17:28 localhost podman[73254]: 2025-11-26 08:17:28.89287378 +0000 UTC m=+0.157263198 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, distribution-scope=public, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.openshift.expose-services=) Nov 26 03:17:28 localhost podman[73255]: 2025-11-26 08:17:28.903266767 +0000 UTC m=+0.164993873 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
vcs-type=git, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 26 03:17:28 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:17:28 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:17:29 localhost sshd[73303]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:17:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:17:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:17:30 localhost systemd[1]: tmp-crun.8b8DLS.mount: Deactivated successfully. 
Nov 26 03:17:30 localhost podman[73305]: 2025-11-26 08:17:30.238583671 +0000 UTC m=+0.104926718 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, container_name=iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:17:30 localhost podman[73305]: 2025-11-26 08:17:30.276573811 +0000 UTC m=+0.142916868 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:17:30 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:17:30 localhost podman[73306]: 2025-11-26 08:17:30.328724605 +0000 UTC m=+0.191831772 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:17:30 localhost podman[73306]: 2025-11-26 08:17:30.363343863 +0000 UTC m=+0.226451000 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:17:30 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:17:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:17:41 localhost podman[73344]: 2025-11-26 08:17:41.825534621 +0000 UTC m=+0.083877904 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_id=tripleo_step1) Nov 26 03:17:42 localhost podman[73344]: 2025-11-26 08:17:42.051547247 +0000 UTC m=+0.309890480 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:17:42 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:17:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:17:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:17:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. 
Nov 26 03:17:54 localhost podman[73388]: 2025-11-26 08:17:54.199213279 +0000 UTC m=+0.095603312 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:17:54 localhost podman[73389]: 2025-11-26 08:17:54.257360177 +0000 UTC m=+0.149280044 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 26 03:17:54 localhost podman[73388]: 2025-11-26 08:17:54.268469606 +0000 UTC m=+0.164859589 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, release=1761123044, tcib_managed=true) Nov 26 03:17:54 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:17:54 localhost systemd[1]: tmp-crun.qwGVwd.mount: Deactivated successfully. Nov 26 03:17:54 localhost podman[73429]: 2025-11-26 08:17:54.368312596 +0000 UTC m=+0.164905939 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, container_name=logrotate_crond, architecture=x86_64) Nov 26 03:17:54 localhost podman[73429]: 2025-11-26 08:17:54.377342153 +0000 UTC m=+0.173935516 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true) Nov 26 03:17:54 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:17:54 localhost podman[73389]: 2025-11-26 08:17:54.419513132 +0000 UTC m=+0.311432949 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, 
build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, container_name=ceilometer_agent_ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:17:54 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:17:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:17:56 localhost podman[73522]: 2025-11-26 08:17:56.37285488 +0000 UTC m=+0.088170966 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:17:56 localhost podman[73522]: 2025-11-26 08:17:56.747390235 +0000 UTC m=+0.462706271 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z) Nov 26 03:17:56 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:17:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:17:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:17:59 localhost systemd[1]: tmp-crun.hsBJRY.mount: Deactivated successfully. Nov 26 03:17:59 localhost podman[73546]: 2025-11-26 08:17:59.827360201 +0000 UTC m=+0.083513913 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4) Nov 26 03:17:59 localhost systemd[1]: tmp-crun.B9SXa4.mount: Deactivated successfully. 
Nov 26 03:17:59 localhost podman[73545]: 2025-11-26 08:17:59.87611975 +0000 UTC m=+0.137010536 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, tcib_managed=true, container_name=ovn_controller, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:17:59 localhost podman[73546]: 2025-11-26 08:17:59.882361542 +0000 UTC m=+0.138515264 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:17:59 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:17:59 localhost podman[73545]: 2025-11-26 08:17:59.922468997 +0000 UTC m=+0.183359783 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z) Nov 26 03:17:59 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:18:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:18:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:18:00 localhost podman[73590]: 2025-11-26 08:18:00.822630843 +0000 UTC m=+0.088246477 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3) Nov 26 03:18:00 localhost podman[73590]: 2025-11-26 08:18:00.857253022 +0000 UTC m=+0.122868646 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 26 03:18:00 localhost systemd[1]: tmp-crun.QxUy6G.mount: Deactivated successfully. Nov 26 03:18:00 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:18:00 localhost podman[73591]: 2025-11-26 08:18:00.878445839 +0000 UTC m=+0.140655799 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public) Nov 26 03:18:00 localhost podman[73591]: 2025-11-26 08:18:00.912628983 +0000 UTC m=+0.174838893 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 26 03:18:00 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:18:10 localhost sshd[73627]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:18:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:18:12 localhost podman[73629]: 2025-11-26 08:18:12.806325693 +0000 UTC m=+0.069272325 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=metrics_qdr, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z) Nov 26 03:18:13 localhost podman[73629]: 2025-11-26 08:18:13.002485074 +0000 UTC m=+0.265431716 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:18:13 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:18:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:18:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:18:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:18:24 localhost systemd[1]: tmp-crun.afTbjF.mount: Deactivated successfully. 
Nov 26 03:18:24 localhost podman[73658]: 2025-11-26 08:18:24.835649881 +0000 UTC m=+0.094361620 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com) Nov 26 03:18:24 localhost podman[73660]: 2025-11-26 08:18:24.885420328 +0000 UTC m=+0.137742555 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64) Nov 26 03:18:24 localhost podman[73658]: 2025-11-26 08:18:24.89080554 +0000 UTC m=+0.149517299 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4) Nov 26 03:18:24 localhost podman[73659]: 2025-11-26 08:18:24.928428782 +0000 UTC m=+0.183025857 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, 
tcib_managed=true, name=rhosp17/openstack-cron, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, container_name=logrotate_crond, vendor=Red Hat, Inc.) Nov 26 03:18:24 localhost podman[73659]: 2025-11-26 08:18:24.937607789 +0000 UTC m=+0.192204864 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red 
Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1) Nov 26 03:18:24 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:18:24 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:18:24 localhost podman[73660]: 2025-11-26 08:18:24.993462029 +0000 UTC m=+0.245784416 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, release=1761123044, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:18:25 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:18:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:18:27 localhost podman[73729]: 2025-11-26 08:18:27.813924945 +0000 UTC m=+0.076578244 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:18:28 localhost podman[73729]: 2025-11-26 08:18:28.185342349 +0000 UTC m=+0.447995628 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1761123044, vcs-type=git, io.openshift.expose-services=, 
version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1) Nov 26 03:18:28 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:18:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:18:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:18:30 localhost podman[73752]: 2025-11-26 08:18:30.81975622 +0000 UTC m=+0.082066299 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Nov 26 03:18:30 localhost systemd[1]: tmp-crun.rHifWI.mount: Deactivated successfully. Nov 26 03:18:30 localhost podman[73752]: 2025-11-26 08:18:30.883415976 +0000 UTC m=+0.145725995 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, build-date=2025-11-18T23:34:05Z, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1) Nov 26 03:18:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:18:30 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:18:30 localhost podman[73753]: 2025-11-26 08:18:30.884091376 +0000 UTC m=+0.142532609 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent) Nov 26 03:18:30 localhost podman[73753]: 2025-11-26 08:18:30.968367901 +0000 UTC m=+0.226809124 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, version=17.1.12, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:18:30 localhost podman[73800]: 2025-11-26 08:18:30.977374533 +0000 UTC m=+0.059432300 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, version=17.1.12, config_id=tripleo_step3, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, architecture=x86_64) Nov 26 03:18:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:18:30 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:18:31 localhost podman[73800]: 2025-11-26 08:18:31.013154498 +0000 UTC m=+0.095212225 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3) Nov 26 03:18:31 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:18:31 localhost podman[73819]: 2025-11-26 08:18:31.06140047 +0000 UTC m=+0.066344247 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) 
Nov 26 03:18:31 localhost podman[73819]: 2025-11-26 08:18:31.07669844 +0000 UTC m=+0.081642247 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:18:31 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:18:31 localhost systemd[1]: tmp-crun.4ZlZPM.mount: Deactivated successfully. Nov 26 03:18:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:18:43 localhost podman[73840]: 2025-11-26 08:18:43.818981755 +0000 UTC m=+0.077289646 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, 
io.openshift.expose-services=, url=https://www.redhat.com, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:18:44 localhost podman[73840]: 2025-11-26 08:18:44.028431505 +0000 UTC m=+0.286739346 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z) Nov 26 03:18:44 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:18:50 localhost python3[73916]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:18:50 localhost python3[73961]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764145130.1825619-113627-256468374700125/source _original_basename=tmpjx3ylgae follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:18:51 localhost python3[73991]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:18:51 localhost sshd[73994]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:18:52 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:18:52 localhost recover_tripleo_nova_virtqemud[74045]: 61604 Nov 26 03:18:52 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:18:52 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 03:18:54 localhost ansible-async_wrapper.py[74167]: Invoked with 344358551092 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764145133.1391976-113789-42755083635277/AnsiballZ_command.py _ Nov 26 03:18:54 localhost ansible-async_wrapper.py[74170]: Starting module and watcher Nov 26 03:18:54 localhost ansible-async_wrapper.py[74170]: Start watching 74171 (3600) Nov 26 03:18:54 localhost ansible-async_wrapper.py[74171]: Start module (74171) Nov 26 03:18:54 localhost ansible-async_wrapper.py[74167]: Return async_wrapper task started. Nov 26 03:18:54 localhost python3[74191]: ansible-ansible.legacy.async_status Invoked with jid=344358551092.74167 mode=status _async_dir=/tmp/.ansible_async Nov 26 03:18:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:18:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:18:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:18:55 localhost podman[74212]: 2025-11-26 08:18:55.824548399 +0000 UTC m=+0.080223945 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=logrotate_crond, url=https://www.redhat.com) Nov 26 03:18:55 localhost podman[74212]: 2025-11-26 08:18:55.837173719 +0000 UTC m=+0.092849315 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., container_name=logrotate_crond, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 26 03:18:55 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:18:55 localhost systemd[1]: tmp-crun.F5Ytnc.mount: Deactivated successfully. 
Nov 26 03:18:55 localhost podman[74211]: 2025-11-26 08:18:55.936684062 +0000 UTC m=+0.193785350 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, 
container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1) Nov 26 03:18:55 localhost podman[74213]: 2025-11-26 08:18:55.902303758 +0000 UTC m=+0.152638863 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 26 03:18:55 localhost podman[74213]: 2025-11-26 08:18:55.985391808 +0000 UTC m=+0.235726903 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:18:55 localhost podman[74211]: 2025-11-26 08:18:55.996562424 +0000 
UTC m=+0.253663712 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, version=17.1.12, architecture=x86_64, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:18:55 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:18:56 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:18:57 localhost podman[74480]: 2025-11-26 08:18:57.30811212 +0000 UTC m=+0.068761591 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=) Nov 26 03:18:57 localhost podman[74480]: 2025-11-26 08:18:57.429854442 +0000 UTC m=+0.190503913 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, RELEASE=main, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_CLEAN=True) Nov 26 03:18:58 localhost puppet-user[74190]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 26 03:18:58 localhost puppet-user[74190]: (file: /etc/puppet/hiera.yaml) Nov 26 03:18:58 localhost puppet-user[74190]: Warning: Undefined variable '::deploy_config_name'; Nov 26 03:18:58 localhost puppet-user[74190]: (file & line not available) Nov 26 03:18:58 localhost puppet-user[74190]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 26 03:18:58 localhost puppet-user[74190]: (file & line not available) Nov 26 03:18:58 localhost puppet-user[74190]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 26 03:18:58 localhost puppet-user[74190]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:18:58 localhost puppet-user[74190]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:18:58 localhost puppet-user[74190]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:18:58 localhost puppet-user[74190]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:18:58 localhost puppet-user[74190]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:18:58 localhost puppet-user[74190]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:18:58 localhost puppet-user[74190]: with Stdlib::Compat::Array. 
There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:18:58 localhost puppet-user[74190]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:18:58 localhost puppet-user[74190]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:18:58 localhost puppet-user[74190]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:18:58 localhost puppet-user[74190]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:18:58 localhost puppet-user[74190]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:18:58 localhost puppet-user[74190]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:18:58 localhost puppet-user[74190]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:18:58 localhost puppet-user[74190]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 26 03:18:58 localhost puppet-user[74190]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 26 03:18:58 localhost puppet-user[74190]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 26 03:18:58 localhost puppet-user[74190]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 26 03:18:58 localhost puppet-user[74190]: Notice: Compiled catalog for np0005536118.localdomain in environment production in 0.22 seconds Nov 26 03:18:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:18:58 localhost puppet-user[74190]: Notice: Applied catalog in 0.32 seconds Nov 26 03:18:58 localhost puppet-user[74190]: Application: Nov 26 03:18:58 localhost puppet-user[74190]: Initial environment: production Nov 26 03:18:58 localhost puppet-user[74190]: Converged environment: production Nov 26 03:18:58 localhost puppet-user[74190]: Run mode: user Nov 26 03:18:58 localhost puppet-user[74190]: Changes: Nov 26 03:18:58 localhost puppet-user[74190]: Events: Nov 26 03:18:58 localhost puppet-user[74190]: Resources: Nov 26 03:18:58 localhost puppet-user[74190]: Total: 19 Nov 26 03:18:58 localhost puppet-user[74190]: Time: Nov 26 03:18:58 localhost puppet-user[74190]: Filebucket: 0.00 Nov 26 03:18:58 localhost puppet-user[74190]: Package: 0.00 Nov 26 03:18:58 localhost puppet-user[74190]: Schedule: 0.00 Nov 26 03:18:58 localhost puppet-user[74190]: Augeas: 0.01 Nov 26 03:18:58 localhost puppet-user[74190]: Exec: 0.01 Nov 26 03:18:58 localhost puppet-user[74190]: File: 0.02 Nov 26 03:18:58 localhost puppet-user[74190]: Service: 0.08 Nov 26 03:18:58 localhost puppet-user[74190]: Config retrieval: 0.29 Nov 26 03:18:58 localhost puppet-user[74190]: Transaction evaluation: 0.31 Nov 26 03:18:58 localhost puppet-user[74190]: Catalog application: 0.32 Nov 26 03:18:58 localhost puppet-user[74190]: Last run: 1764145138 Nov 26 03:18:58 localhost puppet-user[74190]: Total: 0.33 Nov 26 03:18:58 localhost puppet-user[74190]: Version: Nov 26 03:18:58 localhost puppet-user[74190]: Config: 1764145137 Nov 26 03:18:58 localhost puppet-user[74190]: Puppet: 7.10.0 Nov 26 03:18:58 localhost podman[74612]: 2025-11-26 
08:18:58.618026425 +0000 UTC m=+0.096319228 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044) Nov 26 03:18:58 localhost ansible-async_wrapper.py[74171]: Module complete (74171) Nov 26 03:18:58 localhost podman[74612]: 2025-11-26 08:18:58.975388056 +0000 UTC m=+0.453680879 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:18:58 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:18:59 localhost ansible-async_wrapper.py[74170]: Done in kid B. Nov 26 03:19:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:19:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:19:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:19:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:19:01 localhost systemd[1]: tmp-crun.exSS6V.mount: Deactivated successfully. Nov 26 03:19:01 localhost podman[74654]: 2025-11-26 08:19:01.875312444 +0000 UTC m=+0.130501077 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, container_name=iscsid, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, distribution-scope=public) Nov 26 03:19:01 localhost podman[74655]: 2025-11-26 08:19:01.832446345 +0000 UTC m=+0.088364790 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat 
OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Nov 26 03:19:01 localhost podman[74657]: 2025-11-26 08:19:01.889781079 +0000 UTC m=+0.134962471 container health_status 
670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 26 03:19:01 localhost podman[74656]: 2025-11-26 08:19:01.846175218 +0000 UTC m=+0.095921668 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4) Nov 26 03:19:01 localhost podman[74654]: 2025-11-26 08:19:01.908346988 +0000 UTC m=+0.163535651 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, vendor=Red Hat, Inc., container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:19:01 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:19:01 localhost podman[74656]: 2025-11-26 08:19:01.929578336 +0000 UTC m=+0.179324826 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container) Nov 26 03:19:01 localhost podman[74657]: 2025-11-26 08:19:01.952306701 +0000 UTC m=+0.197488133 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:19:01 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:19:01 localhost podman[74655]: 2025-11-26 08:19:01.962315901 +0000 UTC m=+0.218234346 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, architecture=x86_64, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:19:01 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:19:01 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:19:04 localhost python3[74751]: ansible-ansible.legacy.async_status Invoked with jid=344358551092.74167 mode=status _async_dir=/tmp/.ansible_async Nov 26 03:19:05 localhost python3[74767]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 26 03:19:05 localhost python3[74783]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:19:06 localhost python3[74833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:19:06 localhost python3[74851]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpfkcfs9cu recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 26 03:19:07 localhost python3[74881]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Nov 26 03:19:08 localhost python3[74986]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 26 03:19:09 localhost python3[75005]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:19:10 localhost python3[75037]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:19:10 localhost python3[75087]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:19:10 localhost python3[75105]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:19:11 localhost python3[75167]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:19:11 localhost python3[75185]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:19:12 localhost python3[75247]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:19:12 localhost python3[75265]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:19:13 localhost python3[75327]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:19:13 localhost python3[75345]: ansible-ansible.legacy.file Invoked with 
mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:19:14 localhost python3[75375]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:19:14 localhost systemd[1]: Reloading. Nov 26 03:19:14 localhost systemd-rc-local-generator[75393]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:19:14 localhost systemd-sysv-generator[75399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:19:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:19:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:19:14 localhost podman[75413]: 2025-11-26 08:19:14.495256731 +0000 UTC m=+0.097882486 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 03:19:14 localhost podman[75413]: 2025-11-26 08:19:14.71327677 +0000 UTC m=+0.315902485 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:19:14 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:19:14 localhost python3[75492]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:19:15 localhost python3[75510]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:19:15 localhost python3[75572]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 26 03:19:16 localhost python3[75590]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:19:16 localhost python3[75620]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:19:16 localhost systemd[1]: Reloading. Nov 26 03:19:16 localhost systemd-rc-local-generator[75641]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 03:19:16 localhost systemd-sysv-generator[75646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:19:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:19:16 localhost systemd[1]: Starting Create netns directory... Nov 26 03:19:16 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 03:19:16 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 03:19:16 localhost systemd[1]: Finished Create netns directory. Nov 26 03:19:17 localhost python3[75677]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 26 03:19:19 localhost python3[75736]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 26 03:19:19 localhost podman[75774]: 2025-11-26 08:19:19.752847085 +0000 UTC m=+0.099190305 container create f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 
17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Nov 26 03:19:19 localhost podman[75774]: 2025-11-26 08:19:19.702402268 +0000 UTC m=+0.048745498 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 26 03:19:19 localhost systemd[1]: Started libpod-conmon-f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.scope.
Nov 26 03:19:19 localhost systemd[1]: Started libcrun container.
Nov 26 03:19:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c309ab81ad8c7882d0bc2a3cffa363ac8b346f70f1c23bf5a7e70394ef52b071/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 26 03:19:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c309ab81ad8c7882d0bc2a3cffa363ac8b346f70f1c23bf5a7e70394ef52b071/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 26 03:19:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c309ab81ad8c7882d0bc2a3cffa363ac8b346f70f1c23bf5a7e70394ef52b071/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 26 03:19:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c309ab81ad8c7882d0bc2a3cffa363ac8b346f70f1c23bf5a7e70394ef52b071/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 26 03:19:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c309ab81ad8c7882d0bc2a3cffa363ac8b346f70f1c23bf5a7e70394ef52b071/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Nov 26 03:19:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.
Nov 26 03:19:19 localhost podman[75774]: 2025-11-26 08:19:19.864107891 +0000 UTC m=+0.210451161 container init f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container) Nov 26 03:19:19 localhost systemd[1]: tmp-crun.pBuVpt.mount: Deactivated successfully. Nov 26 03:19:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:19:19 localhost podman[75774]: 2025-11-26 08:19:19.911055964 +0000 UTC m=+0.257399194 container start f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:19:19 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. 
Nov 26 03:19:19 localhost python3[75736]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 26 03:19:19 localhost systemd[1]: Created slice User Slice of UID 0. Nov 26 03:19:19 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Nov 26 03:19:19 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 26 03:19:19 localhost systemd[1]: Starting User Manager for UID 0... Nov 26 03:19:20 localhost podman[75795]: 2025-11-26 08:19:20.024447726 +0000 UTC m=+0.103480124 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:19:20 localhost podman[75795]: 2025-11-26 08:19:20.089513833 +0000 UTC m=+0.168546241 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5)
Nov 26 03:19:20 localhost podman[75795]: unhealthy
Nov 26 03:19:20 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE
Nov 26 03:19:20 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'.
Nov 26 03:19:20 localhost systemd[75809]: Queued start job for default target Main User Target.
Nov 26 03:19:20 localhost systemd[75809]: Created slice User Application Slice.
Nov 26 03:19:20 localhost systemd[75809]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 26 03:19:20 localhost systemd[75809]: Started Daily Cleanup of User's Temporary Directories.
Nov 26 03:19:20 localhost systemd[75809]: Reached target Paths.
Nov 26 03:19:20 localhost systemd[75809]: Reached target Timers.
Nov 26 03:19:20 localhost systemd[75809]: Starting D-Bus User Message Bus Socket...
Nov 26 03:19:20 localhost systemd[75809]: Starting Create User's Volatile Files and Directories...
Nov 26 03:19:20 localhost systemd[75809]: Finished Create User's Volatile Files and Directories.
Nov 26 03:19:20 localhost systemd[75809]: Listening on D-Bus User Message Bus Socket.
Nov 26 03:19:20 localhost systemd[75809]: Reached target Sockets.
Nov 26 03:19:20 localhost systemd[75809]: Reached target Basic System.
Nov 26 03:19:20 localhost systemd[75809]: Reached target Main User Target.
Nov 26 03:19:20 localhost systemd[75809]: Startup finished in 157ms.
Nov 26 03:19:20 localhost systemd[1]: Started User Manager for UID 0.
Nov 26 03:19:20 localhost systemd[1]: Started Session c10 of User root.
Nov 26 03:19:20 localhost systemd[1]: session-c10.scope: Deactivated successfully.
Nov 26 03:19:20 localhost podman[75897]: 2025-11-26 08:19:20.517019743 +0000 UTC m=+0.098879955 container create 5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=nova_wait_for_compute_service, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z,
config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 26 03:19:20 localhost podman[75897]: 2025-11-26 08:19:20.46770199 +0000 UTC m=+0.049562202 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 26 03:19:20 localhost systemd[1]: Started libpod-conmon-5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3.scope.
Nov 26 03:19:20 localhost systemd[1]: Started libcrun container.
Nov 26 03:19:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b40f2e3ce469630e9b0ac8513bc38db78e513919de8034db1894708fd48cdba/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Nov 26 03:19:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b40f2e3ce469630e9b0ac8513bc38db78e513919de8034db1894708fd48cdba/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Nov 26 03:19:20 localhost podman[75897]: 2025-11-26 08:19:20.607407023 +0000 UTC m=+0.189267225 container init 5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, container_name=nova_wait_for_compute_service, com.redhat.component=openstack-nova-compute-container, config_data={'detach': False, 'environment':
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:19:20 localhost podman[75897]: 2025-11-26 08:19:20.618785035 +0000 UTC m=+0.200645267 container start 5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, release=1761123044, io.buildah.version=1.41.4, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-type=git, name=rhosp17/openstack-nova-compute) Nov 26 03:19:20 localhost podman[75897]: 2025-11-26 08:19:20.619190917 +0000 UTC m=+0.201051179 container attach 5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_wait_for_compute_service, version=17.1.12, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, container_name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:19:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:19:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:19:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:19:26 localhost podman[75923]: 2025-11-26 08:19:26.810584043 +0000 UTC m=+0.070144792 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:19:26 localhost podman[75921]: 2025-11-26 08:19:26.85936744 +0000 UTC m=+0.118645550 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 26 03:19:26 localhost podman[75923]: 2025-11-26 08:19:26.871459114 +0000 UTC m=+0.131019893 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4) Nov 26 03:19:26 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:19:26 localhost podman[75922]: 2025-11-26 08:19:26.916758527 +0000 UTC m=+0.176019077 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container) Nov 26 03:19:26 localhost podman[75922]: 2025-11-26 08:19:26.925292733 +0000 UTC m=+0.184553243 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, 
managed_by=tripleo_ansible) Nov 26 03:19:26 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:19:26 localhost podman[75921]: 2025-11-26 08:19:26.940447279 +0000 UTC m=+0.199725409 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, 
maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4) Nov 26 03:19:26 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:19:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:19:29 localhost podman[75994]: 2025-11-26 08:19:29.827534171 +0000 UTC m=+0.092896686 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64) Nov 26 03:19:30 localhost podman[75994]: 2025-11-26 08:19:30.195349285 +0000 UTC m=+0.460711811 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:19:30 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:19:30 localhost systemd[1]: Stopping User Manager for UID 0... Nov 26 03:19:30 localhost systemd[75809]: Activating special unit Exit the Session... Nov 26 03:19:30 localhost systemd[75809]: Stopped target Main User Target. Nov 26 03:19:30 localhost systemd[75809]: Stopped target Basic System. Nov 26 03:19:30 localhost systemd[75809]: Stopped target Paths. Nov 26 03:19:30 localhost systemd[75809]: Stopped target Sockets. Nov 26 03:19:30 localhost systemd[75809]: Stopped target Timers. 
Nov 26 03:19:30 localhost systemd[75809]: Stopped Daily Cleanup of User's Temporary Directories. Nov 26 03:19:30 localhost systemd[75809]: Closed D-Bus User Message Bus Socket. Nov 26 03:19:30 localhost systemd[75809]: Stopped Create User's Volatile Files and Directories. Nov 26 03:19:30 localhost systemd[75809]: Removed slice User Application Slice. Nov 26 03:19:30 localhost systemd[75809]: Reached target Shutdown. Nov 26 03:19:30 localhost systemd[75809]: Finished Exit the Session. Nov 26 03:19:30 localhost systemd[75809]: Reached target Exit the Session. Nov 26 03:19:30 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 26 03:19:30 localhost systemd[1]: Stopped User Manager for UID 0. Nov 26 03:19:30 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 26 03:19:30 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 26 03:19:30 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 26 03:19:30 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 26 03:19:30 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 26 03:19:32 localhost sshd[76018]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:19:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:19:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:19:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:19:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:19:32 localhost systemd[1]: tmp-crun.tXEGoC.mount: Deactivated successfully. 
Nov 26 03:19:32 localhost podman[76020]: 2025-11-26 08:19:32.841268804 +0000 UTC m=+0.103071512 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
name=rhosp17/openstack-iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 26 03:19:32 localhost podman[76022]: 2025-11-26 08:19:32.88172141 +0000 UTC m=+0.139555239 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044) Nov 26 03:19:32 localhost podman[76020]: 2025-11-26 08:19:32.927374054 +0000 UTC m=+0.189176752 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, config_id=tripleo_step3) Nov 26 03:19:32 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:19:32 localhost podman[76021]: 2025-11-26 08:19:32.93090471 +0000 UTC m=+0.189249364 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, release=1761123044, container_name=collectd, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container) Nov 26 03:19:33 localhost podman[76021]: 2025-11-26 08:19:33.016370801 +0000 UTC m=+0.274715435 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:19:33 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:19:33 localhost podman[76023]: 2025-11-26 08:19:32.981530433 +0000 UTC m=+0.235339361 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:19:33 localhost podman[76023]: 2025-11-26 08:19:33.067335595 +0000 UTC m=+0.321144453 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, release=1761123044, tcib_managed=true, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git) Nov 26 03:19:33 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:19:33 localhost podman[76022]: 2025-11-26 08:19:33.085978995 +0000 UTC m=+0.343812824 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, com.redhat.component=openstack-ovn-controller-container) Nov 26 03:19:33 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:19:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:19:45 localhost podman[76106]: 2025-11-26 08:19:45.823258403 +0000 UTC m=+0.085864564 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc.) 
Nov 26 03:19:46 localhost podman[76106]: 2025-11-26 08:19:46.013998861 +0000 UTC m=+0.276604942 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:19:46 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:19:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:19:50 localhost podman[76135]: 2025-11-26 08:19:50.823613459 +0000 UTC m=+0.077453672 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:19:50 localhost podman[76135]: 2025-11-26 08:19:50.911464111 +0000 UTC m=+0.165304284 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container) Nov 26 03:19:50 localhost podman[76135]: unhealthy Nov 26 03:19:50 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:19:50 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 03:19:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. 
Nov 26 03:19:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:19:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:19:57 localhost podman[76159]: 2025-11-26 08:19:57.832645149 +0000 UTC m=+0.084804352 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 26 03:19:57 localhost podman[76159]: 2025-11-26 08:19:57.865534709 +0000 UTC m=+0.117693862 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:19:57 localhost systemd[1]: tmp-crun.f9OFKa.mount: Deactivated successfully. Nov 26 03:19:57 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:19:57 localhost podman[76157]: 2025-11-26 08:19:57.883844439 +0000 UTC m=+0.143387184 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true) Nov 26 03:19:57 localhost systemd[1]: tmp-crun.cWB7Sx.mount: Deactivated successfully. Nov 26 03:19:57 localhost podman[76158]: 2025-11-26 08:19:57.930248336 +0000 UTC m=+0.185771941 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-cron, distribution-scope=public, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:19:57 localhost podman[76158]: 2025-11-26 08:19:57.941654429 +0000 UTC m=+0.197178084 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, architecture=x86_64, container_name=logrotate_crond, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z) Nov 26 03:19:57 localhost systemd[1]: 
7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:19:57 localhost podman[76157]: 2025-11-26 08:19:57.99390263 +0000 UTC m=+0.253445385 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, 
distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:19:58 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:20:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:20:00 localhost podman[76299]: 2025-11-26 08:20:00.8261121 +0000 UTC m=+0.082674957 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack 
TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc.) Nov 26 03:20:01 localhost podman[76299]: 2025-11-26 08:20:01.224871507 +0000 UTC m=+0.481434364 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:20:01 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:20:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:20:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:20:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:20:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:20:03 localhost podman[76325]: 2025-11-26 08:20:03.832258694 +0000 UTC m=+0.092294627 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 26 03:20:03 localhost systemd[1]: tmp-crun.fD6rbj.mount: Deactivated successfully. Nov 26 03:20:03 localhost podman[76326]: 2025-11-26 08:20:03.893126894 +0000 UTC m=+0.146969801 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, version=17.1.12, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, container_name=collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:20:03 localhost podman[76326]: 2025-11-26 08:20:03.902874308 +0000 UTC m=+0.156717235 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:20:03 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:20:03 localhost podman[76325]: 2025-11-26 08:20:03.917323793 +0000 UTC m=+0.177359756 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:20:03 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:20:03 localhost podman[76333]: 2025-11-26 08:20:03.996288108 +0000 UTC m=+0.240679221 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true) Nov 26 03:20:04 localhost podman[76333]: 2025-11-26 08:20:04.045315493 +0000 UTC m=+0.289706556 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true) Nov 26 03:20:04 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:20:04 localhost podman[76327]: 2025-11-26 08:20:04.04620754 +0000 UTC m=+0.293356036 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git) Nov 26 03:20:04 localhost podman[76327]: 2025-11-26 08:20:04.131294249 +0000 UTC m=+0.378442715 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.4) Nov 26 03:20:04 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:20:12 localhost sshd[76409]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:20:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:20:16 localhost systemd[1]: tmp-crun.r6FJHY.mount: Deactivated successfully. 
Nov 26 03:20:16 localhost podman[76411]: 2025-11-26 08:20:16.824268743 +0000 UTC m=+0.085709059 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:20:17 localhost podman[76411]: 2025-11-26 08:20:17.076647565 +0000 UTC m=+0.338087841 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 26 03:20:17 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:20:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:20:21 localhost systemd[1]: tmp-crun.sV6KqO.mount: Deactivated successfully. 
Nov 26 03:20:21 localhost podman[76441]: 2025-11-26 08:20:21.816987888 +0000 UTC m=+0.082586965 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:20:21 localhost podman[76441]: 2025-11-26 08:20:21.900105709 +0000 UTC m=+0.165704786 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com) Nov 26 03:20:21 localhost podman[76441]: unhealthy Nov 26 03:20:21 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:20:21 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 03:20:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:20:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:20:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:20:28 localhost podman[76464]: 2025-11-26 08:20:28.842419025 +0000 UTC m=+0.094533444 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:20:28 localhost systemd[1]: tmp-crun.fII1rI.mount: Deactivated successfully. Nov 26 03:20:28 localhost podman[76463]: 2025-11-26 08:20:28.893495342 +0000 UTC m=+0.147612052 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team) Nov 26 03:20:28 localhost podman[76464]: 2025-11-26 08:20:28.904482922 +0000 UTC m=+0.156597371 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container) Nov 26 03:20:28 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:20:28 localhost podman[76463]: 2025-11-26 08:20:28.925621898 +0000 UTC m=+0.179738638 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64) Nov 26 03:20:28 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:20:29 localhost podman[76462]: 2025-11-26 08:20:28.999049367 +0000 UTC m=+0.256611220 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:48Z, version=17.1.12) Nov 26 03:20:29 localhost podman[76462]: 2025-11-26 08:20:29.057461074 +0000 UTC m=+0.315022887 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 03:20:29 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:20:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:20:31 localhost podman[76535]: 2025-11-26 08:20:31.820126603 +0000 UTC m=+0.084718250 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:20:32 localhost podman[76535]: 2025-11-26 08:20:32.229352493 +0000 UTC m=+0.493944130 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 03:20:32 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. 
Nov 26 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:20:34 localhost systemd[1]: tmp-crun.39Nx9U.mount: Deactivated successfully. Nov 26 03:20:34 localhost systemd[1]: tmp-crun.df3XmC.mount: Deactivated successfully. Nov 26 03:20:34 localhost podman[76558]: 2025-11-26 08:20:34.828448922 +0000 UTC m=+0.091391441 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=iscsid, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team) Nov 26 03:20:34 localhost podman[76561]: 2025-11-26 08:20:34.894107218 +0000 UTC m=+0.147070606 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, io.buildah.version=1.41.4) Nov 26 03:20:34 localhost podman[76560]: 2025-11-26 08:20:34.853723783 +0000 UTC m=+0.108878297 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:20:34 localhost podman[76558]: 2025-11-26 08:20:34.913306235 +0000 UTC m=+0.176248743 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, container_name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:20:34 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:20:34 localhost podman[76560]: 2025-11-26 08:20:34.93738091 +0000 UTC m=+0.192535454 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, container_name=ovn_controller, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true) Nov 26 03:20:34 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:20:34 localhost podman[76561]: 2025-11-26 08:20:34.965483505 +0000 UTC m=+0.218446883 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4) Nov 26 03:20:34 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:20:35 localhost podman[76559]: 2025-11-26 08:20:34.999280992 +0000 UTC m=+0.259742415 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:20:35 localhost podman[76559]: 2025-11-26 08:20:35.034410289 +0000 UTC m=+0.294871692 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, 
io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1) Nov 26 03:20:35 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:20:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:20:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:20:47 localhost recover_tripleo_nova_virtqemud[76649]: 61604 Nov 26 03:20:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:20:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:20:47 localhost systemd[1]: tmp-crun.ahffSZ.mount: Deactivated successfully. Nov 26 03:20:47 localhost podman[76642]: 2025-11-26 08:20:47.827806813 +0000 UTC m=+0.090951236 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack 
TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z) Nov 26 03:20:48 localhost podman[76642]: 2025-11-26 08:20:48.018318005 +0000 UTC m=+0.281462478 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, release=1761123044, version=17.1.12, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64) Nov 26 03:20:48 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:20:52 localhost sshd[76672]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:20:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:20:52 localhost systemd[1]: tmp-crun.pCpw7R.mount: Deactivated successfully. Nov 26 03:20:52 localhost podman[76674]: 2025-11-26 08:20:52.832102199 +0000 UTC m=+0.098383241 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:20:52 localhost podman[76674]: 2025-11-26 08:20:52.886385721 +0000 UTC m=+0.152666763 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, 
batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, distribution-scope=public) Nov 26 03:20:52 localhost podman[76674]: unhealthy Nov 26 03:20:52 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:20:52 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 03:20:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:20:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:20:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:20:59 localhost systemd[1]: tmp-crun.WUFft3.mount: Deactivated successfully. 
Nov 26 03:20:59 localhost podman[76698]: 2025-11-26 08:20:59.88203905 +0000 UTC m=+0.140320872 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z) Nov 26 03:20:59 localhost podman[76698]: 2025-11-26 08:20:59.89431269 +0000 UTC m=+0.152594472 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:20:59 localhost podman[76697]: 2025-11-26 08:20:59.84780906 +0000 UTC m=+0.109014790 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:20:59 localhost podman[76699]: 2025-11-26 08:20:59.932771426 +0000 UTC m=+0.189247414 container health_status 
90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:20:59 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:20:59 localhost podman[76697]: 2025-11-26 08:20:59.984348518 +0000 UTC m=+0.245554238 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container) Nov 26 03:20:59 localhost podman[76699]: 2025-11-26 08:20:59.991343448 +0000 UTC m=+0.247819396 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:20:59 localhost systemd[1]: 
3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:21:00 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:21:00 localhost systemd[1]: tmp-crun.3F5Erb.mount: Deactivated successfully. Nov 26 03:21:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:21:02 localhost systemd[1]: session-28.scope: Deactivated successfully. Nov 26 03:21:02 localhost systemd[1]: session-28.scope: Consumed 3.011s CPU time. Nov 26 03:21:02 localhost systemd-logind[761]: Session 28 logged out. Waiting for processes to exit. Nov 26 03:21:02 localhost systemd-logind[761]: Removed session 28. Nov 26 03:21:02 localhost podman[76844]: 2025-11-26 08:21:02.577101985 +0000 UTC m=+0.085330587 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:21:02 localhost podman[76844]: 2025-11-26 08:21:02.954268982 +0000 UTC m=+0.462497554 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true) Nov 26 03:21:02 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:21:05 localhost podman[76870]: 2025-11-26 08:21:05.832309072 +0000 UTC m=+0.084690089 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 03:21:05 localhost podman[76868]: 2025-11-26 08:21:05.862635504 +0000 UTC m=+0.121533106 container health_status 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Nov 26 03:21:05 localhost podman[76868]: 2025-11-26 08:21:05.874200522 +0000 UTC m=+0.133098144 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com) Nov 26 03:21:05 localhost podman[76870]: 2025-11-26 08:21:05.882257315 +0000 UTC m=+0.134638282 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, 
container_name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 03:21:05 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:21:05 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:21:05 localhost podman[76869]: 2025-11-26 08:21:05.919386291 +0000 UTC m=+0.173025435 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, 
io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=) Nov 26 03:21:05 localhost podman[76869]: 2025-11-26 08:21:05.966453898 +0000 UTC m=+0.220093022 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Nov 26 03:21:05 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:21:05 localhost podman[76867]: 2025-11-26 08:21:05.980521931 +0000 UTC m=+0.240349282 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vcs-type=git, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4) Nov 26 03:21:05 localhost podman[76867]: 2025-11-26 08:21:05.992344817 +0000 UTC m=+0.252172148 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=iscsid, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com) Nov 26 03:21:06 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:21:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:21:18 localhost systemd[1]: tmp-crun.EmRK9M.mount: Deactivated successfully. 
Nov 26 03:21:18 localhost podman[76949]: 2025-11-26 08:21:18.821251728 +0000 UTC m=+0.080000698 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, container_name=metrics_qdr, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=) Nov 26 03:21:19 localhost podman[76949]: 2025-11-26 08:21:19.043492124 +0000 UTC m=+0.302241154 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, version=17.1.12, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:21:19 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:21:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:21:23 localhost podman[76978]: 2025-11-26 08:21:23.821251664 +0000 UTC m=+0.085175194 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, config_id=tripleo_step5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:21:23 localhost podman[76978]: 2025-11-26 08:21:23.876371741 +0000 UTC m=+0.140295241 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:21:23 localhost podman[76978]: unhealthy Nov 26 03:21:23 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:21:23 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 03:21:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:21:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:21:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:21:30 localhost podman[76999]: 2025-11-26 08:21:30.830118891 +0000 UTC m=+0.088533974 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 03:21:30 localhost systemd[1]: tmp-crun.fj46M8.mount: Deactivated successfully. Nov 26 03:21:30 localhost podman[77000]: 2025-11-26 08:21:30.849440813 +0000 UTC m=+0.103164435 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Nov 26 03:21:30 localhost podman[77000]: 2025-11-26 08:21:30.882203758 +0000 UTC m=+0.135927370 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:21:30 localhost 
podman[76999]: 2025-11-26 08:21:30.890221779 +0000 UTC m=+0.148636832 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:21:30 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:21:30 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:21:30 localhost podman[77001]: 2025-11-26 08:21:30.931171911 +0000 UTC m=+0.180801220 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.buildah.version=1.41.4) Nov 26 03:21:30 localhost podman[77001]: 2025-11-26 08:21:30.985465434 +0000 UTC m=+0.235094713 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 
17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:21:30 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:21:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:21:33 localhost podman[77068]: 2025-11-26 08:21:33.814893632 +0000 UTC m=+0.077661718 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:21:34 localhost sshd[77091]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:21:34 localhost podman[77068]: 2025-11-26 08:21:34.173356966 +0000 UTC m=+0.436125072 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044) Nov 26 03:21:34 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: 
Deactivated successfully. Nov 26 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:21:36 localhost systemd[1]: tmp-crun.if0LvR.mount: Deactivated successfully. Nov 26 03:21:36 localhost podman[77096]: 2025-11-26 08:21:36.891867508 +0000 UTC m=+0.141406756 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, 
vcs-type=git, distribution-scope=public, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 26 03:21:36 localhost podman[77093]: 2025-11-26 08:21:36.956264625 +0000 UTC m=+0.210772052 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, 
health_status=healthy, batch=17.1_20251118.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git) Nov 26 03:21:36 localhost podman[77096]: 2025-11-26 08:21:36.982313088 +0000 UTC m=+0.231852336 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=) Nov 26 03:21:36 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:21:37 localhost podman[77094]: 2025-11-26 08:21:36.99999195 +0000 UTC m=+0.250988661 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team) Nov 26 03:21:37 localhost podman[77095]: 2025-11-26 08:21:36.860301477 +0000 UTC m=+0.110947778 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Nov 26 03:21:37 localhost podman[77094]: 2025-11-26 08:21:37.01560227 +0000 UTC m=+0.266598961 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:21:37 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:21:37 localhost podman[77095]: 2025-11-26 08:21:37.044398466 +0000 UTC m=+0.295044767 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, container_name=ovn_controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team) Nov 26 03:21:37 localhost podman[77093]: 2025-11-26 08:21:37.053305954 +0000 UTC m=+0.307813371 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, release=1761123044, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, container_name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 26 03:21:37 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:21:37 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:21:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:21:49 localhost podman[77180]: 2025-11-26 08:21:49.825681416 +0000 UTC m=+0.086387349 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:21:49 localhost podman[77180]: 2025-11-26 08:21:49.98736364 +0000 UTC m=+0.248069513 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, tcib_managed=true, vendor=Red Hat, Inc.) Nov 26 03:21:49 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:21:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:21:54 localhost podman[77209]: 2025-11-26 08:21:54.812132334 +0000 UTC m=+0.076045108 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:36:58Z, architecture=x86_64, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, version=17.1.12) Nov 26 03:21:54 localhost podman[77209]: 2025-11-26 08:21:54.86979383 +0000 UTC m=+0.133706644 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red 
Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=) Nov 26 03:21:54 localhost podman[77209]: unhealthy Nov 26 03:21:54 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:21:54 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 03:22:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:22:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:22:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:22:01 localhost systemd[1]: tmp-crun.kSBN3p.mount: Deactivated successfully. 
Nov 26 03:22:01 localhost podman[77230]: 2025-11-26 08:22:01.825064904 +0000 UTC m=+0.083150212 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z) Nov 26 03:22:01 localhost podman[77230]: 2025-11-26 08:22:01.854973654 +0000 UTC m=+0.113058952 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 
cron, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, vcs-type=git, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:22:01 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:22:01 localhost systemd[1]: tmp-crun.mPKinU.mount: Deactivated successfully. 
Nov 26 03:22:01 localhost podman[77229]: 2025-11-26 08:22:01.895020719 +0000 UTC m=+0.155347355 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Nov 26 03:22:01 localhost podman[77231]: 2025-11-26 08:22:01.934548988 +0000 UTC m=+0.185701207 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Nov 26 03:22:01 localhost podman[77231]: 2025-11-26 08:22:01.962269752 +0000 UTC m=+0.213422001 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:22:01 localhost systemd[1]: 
90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:22:02 localhost podman[77229]: 2025-11-26 08:22:02.019034389 +0000 UTC m=+0.279361055 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:22:02 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:22:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:22:04 localhost systemd[1]: tmp-crun.qQo2jb.mount: Deactivated successfully. 
Nov 26 03:22:04 localhost podman[77380]: 2025-11-26 08:22:04.834092965 +0000 UTC m=+0.096955968 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:22:05 localhost podman[77380]: 2025-11-26 08:22:05.206252371 +0000 UTC m=+0.469115304 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:22:05 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:22:07 localhost systemd[1]: tmp-crun.kJmJqb.mount: Deactivated successfully. Nov 26 03:22:07 localhost podman[77406]: 2025-11-26 08:22:07.838718124 +0000 UTC m=+0.093178545 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, managed_by=tripleo_ansible) Nov 26 03:22:07 localhost podman[77403]: 2025-11-26 08:22:07.868661584 +0000 UTC m=+0.130251289 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:22:07 localhost podman[77405]: 2025-11-26 08:22:07.92068656 +0000 UTC 
m=+0.176979245 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044) Nov 26 03:22:07 localhost podman[77406]: 2025-11-26 08:22:07.924206105 +0000 UTC m=+0.178666546 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com) Nov 26 03:22:07 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:22:07 localhost podman[77405]: 2025-11-26 08:22:07.942437824 +0000 UTC m=+0.198730479 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:22:07 localhost podman[77403]: 2025-11-26 08:22:07.952000561 +0000 UTC m=+0.213590276 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container) Nov 26 03:22:07 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:22:07 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:22:07 localhost podman[77404]: 2025-11-26 08:22:07.82165016 +0000 UTC m=+0.082785821 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, architecture=x86_64, tcib_managed=true, container_name=collectd, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Nov 26 03:22:08 localhost podman[77404]: 2025-11-26 08:22:08.005265224 +0000 UTC m=+0.266400905 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 26 03:22:08 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:22:16 localhost sshd[77490]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:22:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:22:20 localhost podman[77492]: 2025-11-26 08:22:20.818009941 +0000 UTC m=+0.080580736 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:22:21 localhost podman[77492]: 2025-11-26 08:22:21.007358346 +0000 UTC m=+0.269929111 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr) Nov 26 03:22:21 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:22:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:22:25 localhost podman[77611]: 2025-11-26 08:22:25.814912433 +0000 UTC m=+0.075443057 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:22:25 localhost podman[77611]: 2025-11-26 08:22:25.850369591 +0000 UTC m=+0.110900225 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_compute) Nov 26 03:22:25 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:22:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:22:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:22:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:22:32 localhost podman[77638]: 2025-11-26 08:22:32.833117278 +0000 UTC m=+0.093148441 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044) Nov 26 03:22:32 localhost podman[77638]: 2025-11-26 08:22:32.863128968 +0000 UTC m=+0.123160111 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, 
url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:22:32 localhost systemd[1]: tmp-crun.PtIP8n.mount: Deactivated successfully. 
Nov 26 03:22:32 localhost podman[77639]: 2025-11-26 08:22:32.886553488 +0000 UTC m=+0.146221889 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, version=17.1.12, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:22:32 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:22:32 localhost podman[77640]: 2025-11-26 08:22:32.936720568 +0000 UTC m=+0.194071849 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, 
build-date=2025-11-19T00:12:45Z, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:22:32 localhost podman[77639]: 2025-11-26 08:22:32.952290845 +0000 UTC m=+0.211959216 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team) Nov 26 03:22:32 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:22:32 localhost podman[77640]: 2025-11-26 08:22:32.968239995 +0000 UTC m=+0.225591276 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:22:32 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:22:33 localhost systemd[1]: libpod-5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3.scope: Deactivated successfully. Nov 26 03:22:33 localhost podman[77709]: 2025-11-26 08:22:33.308277002 +0000 UTC m=+0.057824595 container died 5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step5, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., release=1761123044) Nov 26 03:22:33 localhost podman[77709]: 2025-11-26 08:22:33.3411327 +0000 UTC m=+0.090680233 container cleanup 5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1) Nov 26 03:22:33 localhost systemd[1]: libpod-conmon-5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3.scope: Deactivated successfully. Nov 26 03:22:33 localhost python3[75736]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=c7803ed1795969cb7cf47e6d4d57c4b9 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt 
path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 26 03:22:33 localhost systemd[1]: var-lib-containers-storage-overlay-6b40f2e3ce469630e9b0ac8513bc38db78e513919de8034db1894708fd48cdba-merged.mount: Deactivated successfully. Nov 26 03:22:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3-userdata-shm.mount: Deactivated successfully. 
Nov 26 03:22:33 localhost python3[77762]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:22:34 localhost python3[77778]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 26 03:22:34 localhost python3[77839]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764145354.285221-118442-224093371601776/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:22:35 localhost python3[77855]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 03:22:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:22:35 localhost systemd[1]: Reloading. Nov 26 03:22:35 localhost systemd-sysv-generator[77895]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:22:35 localhost systemd-rc-local-generator[77891]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 03:22:35 localhost podman[77857]: 2025-11-26 08:22:35.363188295 +0000 UTC m=+0.098935538 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:22:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:22:35 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:22:35 localhost recover_tripleo_nova_virtqemud[77914]: 61604 Nov 26 03:22:35 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:22:35 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 03:22:35 localhost podman[77857]: 2025-11-26 08:22:35.810433162 +0000 UTC m=+0.546180435 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 26 03:22:35 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:22:36 localhost python3[77933]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 03:22:36 localhost systemd[1]: Reloading. Nov 26 03:22:36 localhost systemd-sysv-generator[77961]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:22:36 localhost systemd-rc-local-generator[77958]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:22:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:22:36 localhost systemd[1]: Starting nova_compute container... Nov 26 03:22:36 localhost tripleo-start-podman-container[77972]: Creating additional drop-in dependency for "nova_compute" (f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d) Nov 26 03:22:36 localhost systemd[1]: Reloading. 
Nov 26 03:22:36 localhost systemd-rc-local-generator[78031]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 03:22:36 localhost systemd-sysv-generator[78034]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 03:22:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 03:22:37 localhost systemd[1]: Started nova_compute container. Nov 26 03:22:37 localhost python3[78072]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:22:38 localhost systemd[1]: tmp-crun.pqvPeJ.mount: Deactivated successfully. 
Nov 26 03:22:38 localhost podman[78073]: 2025-11-26 08:22:38.123355545 +0000 UTC m=+0.129320070 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:22:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:22:38 localhost podman[78074]: 2025-11-26 08:22:38.135721044 +0000 UTC m=+0.136601523 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com) Nov 26 03:22:38 localhost podman[78075]: 2025-11-26 08:22:38.190129295 +0000 UTC m=+0.187623600 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 26 03:22:38 localhost podman[78073]: 2025-11-26 08:22:38.194400815 +0000 UTC m=+0.200365330 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Nov 26 03:22:38 localhost systemd[1]: 
670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:22:38 localhost podman[78074]: 2025-11-26 08:22:38.218283109 +0000 UTC m=+0.219163628 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, release=1761123044, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid) Nov 26 03:22:38 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:22:38 localhost podman[78075]: 2025-11-26 08:22:38.241441339 +0000 UTC m=+0.238935674 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
config_id=tripleo_step4, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git) Nov 26 03:22:38 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:22:38 localhost podman[78131]: 2025-11-26 08:22:38.338371395 +0000 UTC m=+0.192811830 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:22:38 localhost podman[78131]: 2025-11-26 08:22:38.392338321 +0000 UTC m=+0.246778766 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Nov 26 03:22:38 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:22:39 localhost python3[78277]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005536118 step=5 update_config_hash_only=False Nov 26 03:22:39 localhost python3[78293]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 03:22:40 localhost python3[78309]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 26 03:22:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:22:51 localhost podman[78310]: 2025-11-26 08:22:51.8281865 +0000 UTC m=+0.090510769 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, version=17.1.12, release=1761123044, config_id=tripleo_step1, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:22:52 localhost podman[78310]: 2025-11-26 08:22:52.064495544 +0000 UTC m=+0.326819823 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, config_id=tripleo_step1, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:22:52 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:22:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:22:56 localhost podman[78339]: 2025-11-26 08:22:56.822829124 +0000 UTC m=+0.081766631 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-nova-compute-container) Nov 26 03:22:56 localhost podman[78339]: 2025-11-26 08:22:56.876079928 +0000 UTC m=+0.135017445 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=nova_compute, release=1761123044, name=rhosp17/openstack-nova-compute) Nov 26 03:22:56 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:22:57 localhost sshd[78368]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:23:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:23:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:23:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:23:03 localhost podman[78371]: 2025-11-26 08:23:03.817866607 +0000 UTC m=+0.079647865 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, version=17.1.12, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:23:03 localhost podman[78371]: 2025-11-26 08:23:03.854597145 +0000 UTC m=+0.116378393 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 
cron, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-cron, version=17.1.12, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_id=tripleo_step4, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z) Nov 26 03:23:03 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:23:03 localhost podman[78370]: 2025-11-26 08:23:03.872728592 +0000 UTC m=+0.137426879 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64) Nov 26 03:23:03 localhost podman[78370]: 2025-11-26 08:23:03.904323611 +0000 UTC m=+0.169021948 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:11:48Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 26 03:23:03 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:23:03 localhost podman[78372]: 2025-11-26 08:23:03.922761708 +0000 UTC m=+0.179224612 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:23:03 localhost podman[78372]: 2025-11-26 08:23:03.974276168 +0000 UTC m=+0.230739082 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:23:03 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:23:04 localhost sshd[78488]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:23:04 localhost systemd-logind[761]: New session 34 of user zuul. Nov 26 03:23:04 localhost systemd[1]: Started Session 34 of User zuul. Nov 26 03:23:05 localhost python3[78614]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 03:23:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:23:06 localhost podman[78706]: 2025-11-26 08:23:06.861639763 +0000 UTC m=+0.122318076 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible) Nov 26 03:23:07 localhost podman[78706]: 2025-11-26 08:23:07.240425749 +0000 UTC m=+0.501104062 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team) Nov 26 03:23:07 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:23:08 localhost systemd[1]: tmp-crun.LYqndB.mount: Deactivated successfully. Nov 26 03:23:08 localhost podman[78839]: 2025-11-26 08:23:08.849174838 +0000 UTC m=+0.109768070 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step3, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:23:08 localhost podman[78839]: 2025-11-26 08:23:08.864433786 +0000 UTC m=+0.125026998 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, distribution-scope=public, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64) Nov 26 03:23:08 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:23:08 localhost podman[78842]: 2025-11-26 08:23:08.944906016 +0000 UTC m=+0.197233275 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z) Nov 26 03:23:08 localhost podman[78842]: 2025-11-26 08:23:08.996302064 +0000 UTC m=+0.248629323 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, release=1761123044, container_name=ovn_metadata_agent) Nov 26 03:23:09 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:23:09 localhost podman[78841]: 2025-11-26 08:23:09.045230085 +0000 UTC m=+0.301310958 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z) Nov 26 03:23:09 localhost podman[78840]: 2025-11-26 08:23:08.997011536 +0000 UTC m=+0.256311549 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:23:09 localhost podman[78840]: 2025-11-26 08:23:09.129164152 +0000 UTC m=+0.388464185 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 
collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, release=1761123044) Nov 26 03:23:09 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:23:09 localhost podman[78841]: 2025-11-26 08:23:09.150135946 +0000 UTC m=+0.406216829 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, release=1761123044, version=17.1.12) Nov 26 03:23:09 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:23:13 localhost python3[79003]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Nov 26 03:23:20 localhost python3[79096]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Nov 26 03:23:20 localhost kernel: Warning: Deprecated Driver is 
detected: nft_compat will not be maintained in a future major release and may be disabled Nov 26 03:23:20 localhost systemd-journald[47778]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation. Nov 26 03:23:20 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 26 03:23:20 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 03:23:20 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 03:23:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:23:22 localhost systemd[1]: tmp-crun.B44UGZ.mount: Deactivated successfully. Nov 26 03:23:22 localhost podman[79139]: 2025-11-26 08:23:22.8325427 +0000 UTC m=+0.094426360 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, release=1761123044, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:23:23 localhost podman[79139]: 2025-11-26 08:23:23.032181397 +0000 UTC m=+0.294065007 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:23:23 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:23:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:23:27 localhost systemd[1]: tmp-crun.uNIhQI.mount: Deactivated successfully. Nov 26 03:23:27 localhost podman[79195]: 2025-11-26 08:23:27.823780208 +0000 UTC m=+0.083854394 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:23:27 localhost podman[79195]: 2025-11-26 08:23:27.857522134 +0000 UTC m=+0.117596330 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 
nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:23:27 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:23:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:23:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:23:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:23:34 localhost systemd[1]: tmp-crun.bCv7TU.mount: Deactivated successfully. 
Nov 26 03:23:34 localhost podman[79222]: 2025-11-26 08:23:34.870033535 +0000 UTC m=+0.131408704 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z) Nov 26 03:23:34 localhost podman[79221]: 2025-11-26 08:23:34.838734064 +0000 UTC m=+0.099415712 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:23:34 localhost podman[79223]: 2025-11-26 08:23:34.938103214 +0000 UTC m=+0.197317767 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:23:34 localhost podman[79222]: 2025-11-26 08:23:34.960677667 +0000 UTC m=+0.222052866 container exec_died 
7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:23:34 localhost podman[79221]: 2025-11-26 08:23:34.970103506 +0000 UTC m=+0.230785094 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, 
name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:23:34 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:23:34 localhost podman[79223]: 2025-11-26 08:23:34.991144333 +0000 UTC m=+0.250358866 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git) Nov 26 03:23:35 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:23:35 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:23:35 localhost systemd[1]: tmp-crun.TpUO4f.mount: Deactivated successfully. Nov 26 03:23:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:23:37 localhost podman[79297]: 2025-11-26 08:23:37.832561435 +0000 UTC m=+0.095440531 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, 
build-date=2025-11-19T00:36:58Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible) Nov 26 03:23:38 localhost podman[79297]: 2025-11-26 08:23:38.222335918 +0000 UTC m=+0.485215014 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12) Nov 26 03:23:38 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:23:39 localhost podman[79321]: 2025-11-26 08:23:39.825108873 +0000 UTC m=+0.085417232 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, container_name=iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12) Nov 26 03:23:39 localhost systemd[1]: tmp-crun.ti8EOm.mount: Deactivated successfully. Nov 26 03:23:39 localhost podman[79324]: 2025-11-26 08:23:39.887132778 +0000 UTC m=+0.138674148 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:23:39 localhost podman[79321]: 2025-11-26 08:23:39.9106781 +0000 UTC m=+0.170986479 container exec_died 
1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:23:39 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:23:39 localhost sshd[79384]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:23:39 localhost podman[79323]: 2025-11-26 08:23:39.991886953 +0000 UTC m=+0.247446536 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Nov 26 03:23:40 localhost podman[79324]: 2025-11-26 08:23:40.007355797 +0000 UTC m=+0.258897177 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, container_name=ovn_metadata_agent) Nov 26 03:23:40 localhost podman[79323]: 2025-11-26 08:23:40.016299382 +0000 
UTC m=+0.271858935 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, architecture=x86_64, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.expose-services=) Nov 26 03:23:40 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:23:40 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:23:40 localhost podman[79322]: 2025-11-26 08:23:40.087300191 +0000 UTC m=+0.343287328 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:23:40 localhost podman[79322]: 2025-11-26 08:23:40.100304381 +0000 UTC m=+0.356291518 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:23:40 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:23:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:23:53 localhost podman[79409]: 2025-11-26 08:23:53.822918569 +0000 UTC m=+0.085625929 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:23:54 localhost podman[79409]: 2025-11-26 08:23:54.036721821 +0000 UTC m=+0.299429131 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:23:54 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:23:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:23:58 localhost podman[79438]: 2025-11-26 08:23:58.812764587 +0000 UTC m=+0.078469920 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, architecture=x86_64) Nov 26 03:23:58 localhost podman[79438]: 2025-11-26 08:23:58.837902268 +0000 UTC m=+0.103607641 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, 
managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4) Nov 26 03:23:58 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:24:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:24:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:24:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:24:05 localhost podman[79480]: 2025-11-26 08:24:05.841631117 +0000 UTC m=+0.098847954 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:24:05 localhost podman[79480]: 2025-11-26 08:24:05.875320851 +0000 UTC m=+0.132537708 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public) Nov 26 03:24:05 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:24:05 localhost podman[79479]: 2025-11-26 08:24:05.892948613 +0000 UTC m=+0.151766119 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, container_name=ceilometer_agent_compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:24:05 localhost podman[79481]: 2025-11-26 08:24:05.975492956 +0000 UTC m=+0.230074362 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, 
name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 26 03:24:05 localhost podman[79479]: 2025-11-26 08:24:05.998296816 +0000 UTC m=+0.257114362 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:24:06 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:24:06 localhost podman[79481]: 2025-11-26 08:24:06.054889114 +0000 UTC m=+0.309470550 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Nov 26 03:24:06 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:24:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:24:08 localhost systemd[1]: tmp-crun.ZuBjuB.mount: Deactivated successfully. 
Nov 26 03:24:08 localhost podman[79614]: 2025-11-26 08:24:08.827784804 +0000 UTC m=+0.083562475 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Nov 26 03:24:09 localhost podman[79614]: 2025-11-26 08:24:09.211384438 +0000 UTC m=+0.467162099 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_migration_target, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4) Nov 26 03:24:09 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:24:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:24:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:24:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:24:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:24:10 localhost podman[79636]: 2025-11-26 08:24:10.830226177 +0000 UTC m=+0.094140540 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:24:10 localhost podman[79636]: 2025-11-26 08:24:10.86484341 +0000 UTC m=+0.128757813 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack 
Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 26 03:24:10 localhost podman[79640]: 2025-11-26 08:24:10.890308921 +0000 UTC m=+0.145626020 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible) Nov 26 03:24:10 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:24:10 localhost podman[79637]: 2025-11-26 08:24:10.942534894 +0000 UTC m=+0.200885026 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, batch=17.1_20251118.1) Nov 26 03:24:10 localhost podman[79640]: 2025-11-26 08:24:10.944551876 +0000 UTC m=+0.199869005 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team) Nov 26 03:24:10 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:24:10 localhost podman[79638]: 2025-11-26 08:24:10.992060734 +0000 UTC m=+0.248689994 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, vcs-type=git) Nov 26 03:24:11 localhost podman[79638]: 2025-11-26 08:24:11.038590323 +0000 UTC m=+0.295219583 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 26 03:24:11 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:24:11 localhost podman[79637]: 2025-11-26 08:24:11.127337866 +0000 UTC m=+0.385687948 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Nov 26 03:24:11 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:24:11 localhost systemd[1]: tmp-crun.EjSIDJ.mount: Deactivated successfully. 
Nov 26 03:24:19 localhost sshd[79723]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:24:20 localhost systemd[1]: session-34.scope: Deactivated successfully. Nov 26 03:24:20 localhost systemd[1]: session-34.scope: Consumed 5.729s CPU time. Nov 26 03:24:20 localhost systemd-logind[761]: Session 34 logged out. Waiting for processes to exit. Nov 26 03:24:20 localhost systemd-logind[761]: Removed session 34. Nov 26 03:24:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:24:24 localhost systemd[1]: tmp-crun.pqnK7R.mount: Deactivated successfully. Nov 26 03:24:24 localhost podman[79745]: 2025-11-26 08:24:24.82113983 +0000 UTC m=+0.084066751 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1) Nov 26 03:24:25 localhost podman[79745]: 2025-11-26 08:24:25.075468937 +0000 UTC m=+0.338395818 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:46Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:24:25 localhost systemd[1]: 
b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:24:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:24:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:24:29 localhost recover_tripleo_nova_virtqemud[79800]: 61604 Nov 26 03:24:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:24:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:24:29 localhost podman[79798]: 2025-11-26 08:24:29.823579105 +0000 UTC m=+0.086688072 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, container_name=nova_compute, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, config_id=tripleo_step5, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:24:29 localhost podman[79798]: 2025-11-26 08:24:29.860350814 +0000 UTC m=+0.123459791 container 
exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Nov 26 03:24:29 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:24:31 localhost sshd[79826]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:24:31 localhost systemd-logind[761]: New session 35 of user zuul. Nov 26 03:24:31 localhost systemd[1]: Started Session 35 of User zuul. 
Nov 26 03:24:31 localhost python3[79845]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 26 03:24:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:24:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:24:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:24:36 localhost podman[79849]: 2025-11-26 08:24:36.836424885 +0000 UTC m=+0.090892061 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4) Nov 26 03:24:36 localhost systemd[1]: tmp-crun.Qbncm5.mount: Deactivated successfully. Nov 26 03:24:36 localhost podman[79849]: 2025-11-26 08:24:36.894473817 +0000 UTC m=+0.148940923 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Nov 26 03:24:36 localhost podman[79847]: 2025-11-26 08:24:36.893438475 +0000 UTC m=+0.150698846 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, 
distribution-scope=public, name=rhosp17/openstack-ceilometer-compute) Nov 26 03:24:36 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:24:36 localhost podman[79847]: 2025-11-26 08:24:36.980416705 +0000 UTC m=+0.237677036 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:24:36 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:24:37 localhost podman[79848]: 2025-11-26 08:24:37.039851959 +0000 UTC m=+0.298566515 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, io.openshift.expose-services=) Nov 26 03:24:37 localhost podman[79848]: 2025-11-26 08:24:37.078403213 +0000 UTC m=+0.337117729 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git) Nov 26 03:24:37 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:24:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:24:39 localhost podman[79920]: 2025-11-26 08:24:39.823411526 +0000 UTC m=+0.077648925 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible) Nov 26 03:24:40 localhost podman[79920]: 2025-11-26 08:24:40.21646736 +0000 UTC m=+0.470704789 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:24:40 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:24:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:24:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:24:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:24:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:24:41 localhost podman[79945]: 2025-11-26 08:24:41.824152076 +0000 UTC m=+0.081327157 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4) Nov 26 03:24:41 localhost systemd[1]: tmp-crun.qA59cx.mount: Deactivated successfully. Nov 26 03:24:41 localhost podman[79943]: 2025-11-26 08:24:41.87639151 +0000 UTC m=+0.138180473 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true) Nov 26 03:24:41 localhost podman[79943]: 2025-11-26 08:24:41.89238872 +0000 UTC m=+0.154177703 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:24:41 localhost podman[79945]: 2025-11-26 08:24:41.901662036 +0000 UTC m=+0.158837157 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:24:41 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:24:41 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:24:41 localhost podman[79951]: 2025-11-26 08:24:41.893173895 +0000 UTC m=+0.139184263 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com) Nov 26 03:24:41 localhost podman[79951]: 2025-11-26 08:24:41.974233942 +0000 UTC m=+0.220244280 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 26 03:24:41 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:24:42 localhost podman[79944]: 2025-11-26 08:24:42.037712221 +0000 UTC m=+0.293262363 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-type=git, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:24:42 localhost podman[79944]: 2025-11-26 08:24:42.047281635 +0000 UTC m=+0.302831737 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack 
Platform 17.1 collectd) Nov 26 03:24:42 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:24:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:24:55 localhost podman[80032]: 2025-11-26 08:24:55.816707871 +0000 UTC m=+0.079579704 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd) Nov 26 03:24:56 localhost podman[80032]: 2025-11-26 08:24:56.00631119 +0000 UTC m=+0.269182983 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, container_name=metrics_qdr) Nov 26 03:24:56 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:24:59 localhost python3[80076]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 26 03:24:59 localhost sshd[80078]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:25:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:25:00 localhost systemd[1]: tmp-crun.fQay9S.mount: Deactivated successfully. Nov 26 03:25:00 localhost podman[80080]: 2025-11-26 08:25:00.674123442 +0000 UTC m=+0.094540123 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public) Nov 26 03:25:00 localhost podman[80080]: 2025-11-26 08:25:00.731663348 +0000 UTC m=+0.152080009 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com) Nov 26 03:25:00 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:25:03 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 26 03:25:03 localhost systemd[1]: Starting man-db-cache-update.service... Nov 26 03:25:03 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Nov 26 03:25:03 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 26 03:25:03 localhost systemd[1]: Finished man-db-cache-update.service. Nov 26 03:25:03 localhost systemd[1]: run-r66f09cdcd73c46e98b2a6678c7492a0c.service: Deactivated successfully. Nov 26 03:25:03 localhost systemd[1]: run-r9ceed046f0ec43bebaa721fde898399a.service: Deactivated successfully. Nov 26 03:25:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:25:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:25:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:25:07 localhost podman[80268]: 2025-11-26 08:25:07.647070928 +0000 UTC m=+0.105433057 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute) Nov 26 03:25:07 localhost podman[80271]: 2025-11-26 08:25:07.691178072 +0000 UTC m=+0.149915553 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, 
managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:25:07 localhost podman[80268]: 2025-11-26 08:25:07.697367222 +0000 UTC m=+0.155729271 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:25:07 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:25:07 localhost podman[80271]: 2025-11-26 08:25:07.726587838 +0000 UTC m=+0.185325319 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi) Nov 26 03:25:07 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:25:07 localhost podman[80270]: 2025-11-26 08:25:07.801430146 +0000 UTC m=+0.259844317 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com) Nov 26 03:25:07 localhost podman[80270]: 2025-11-26 08:25:07.837254356 +0000 UTC m=+0.295668537 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com) Nov 26 03:25:07 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:25:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:25:10 localhost systemd[1]: tmp-crun.VuD9f0.mount: Deactivated successfully. Nov 26 03:25:10 localhost podman[80401]: 2025-11-26 08:25:10.668157767 +0000 UTC m=+0.100031751 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack 
Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, version=17.1.12) Nov 26 03:25:11 localhost podman[80401]: 2025-11-26 08:25:11.047457979 +0000 UTC m=+0.479331933 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Nov 26 03:25:11 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:25:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:25:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:25:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:25:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:25:12 localhost podman[80426]: 2025-11-26 08:25:12.821416758 +0000 UTC m=+0.082692009 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 26 03:25:12 localhost podman[80426]: 2025-11-26 08:25:12.835531152 +0000 UTC m=+0.096806373 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:25:12 localhost podman[80428]: 2025-11-26 08:25:12.843251199 +0000 UTC m=+0.096682529 container 
health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:25:12 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:25:12 localhost podman[80425]: 2025-11-26 08:25:12.926204555 +0000 UTC m=+0.188841767 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, version=17.1.12, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid) Nov 26 03:25:12 localhost podman[80425]: 2025-11-26 08:25:12.933471668 +0000 UTC m=+0.196108890 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, architecture=x86_64, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 26 03:25:12 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:25:12 localhost podman[80427]: 2025-11-26 08:25:12.978469439 +0000 UTC m=+0.232934020 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, container_name=ovn_controller, 
build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1) Nov 26 03:25:13 localhost podman[80428]: 2025-11-26 08:25:13.005980154 +0000 UTC m=+0.259411474 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:25:13 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:25:13 localhost podman[80427]: 2025-11-26 08:25:13.028156904 +0000 UTC m=+0.282621465 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:25:13 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:25:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:25:26 localhost systemd[1]: tmp-crun.zHrkGC.mount: Deactivated successfully. Nov 26 03:25:26 localhost podman[80556]: 2025-11-26 08:25:26.827404115 +0000 UTC m=+0.089220130 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4) Nov 26 03:25:27 localhost podman[80556]: 2025-11-26 08:25:27.007620756 +0000 UTC m=+0.269436711 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.41.4, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Nov 26 03:25:27 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:25:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:25:31 localhost systemd[1]: tmp-crun.aD5BDg.mount: Deactivated successfully. Nov 26 03:25:31 localhost podman[80585]: 2025-11-26 08:25:31.825114283 +0000 UTC m=+0.087477706 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 
'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true) Nov 26 03:25:31 localhost podman[80585]: 2025-11-26 08:25:31.877223482 +0000 UTC m=+0.139586795 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, version=17.1.12, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:25:31 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:25:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:25:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:25:37 localhost recover_tripleo_nova_virtqemud[80614]: 61604 Nov 26 03:25:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:25:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:25:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:25:37 localhost podman[80611]: 2025-11-26 08:25:37.820736032 +0000 UTC m=+0.083443953 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Nov 26 03:25:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:25:37 localhost podman[80626]: 2025-11-26 08:25:37.889572474 +0000 UTC m=+0.095742850 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:25:37 localhost podman[80611]: 2025-11-26 08:25:37.909689011 +0000 UTC m=+0.172396912 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com) Nov 26 03:25:37 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:25:37 localhost podman[80626]: 2025-11-26 08:25:37.929321504 +0000 UTC m=+0.135491900 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc.) Nov 26 03:25:37 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:25:37 localhost podman[80654]: 2025-11-26 08:25:37.979230476 +0000 UTC m=+0.089204389 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, 
release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git) Nov 26 03:25:37 localhost podman[80654]: 2025-11-26 08:25:37.987323614 +0000 UTC m=+0.097297577 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4) Nov 26 03:25:37 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:25:38 localhost systemd[1]: tmp-crun.gEa6tb.mount: Deactivated successfully. Nov 26 03:25:40 localhost sshd[80687]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:25:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:25:41 localhost podman[80689]: 2025-11-26 08:25:41.737606705 +0000 UTC m=+0.074546979 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, 
distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:25:42 localhost podman[80689]: 2025-11-26 08:25:42.103193706 +0000 UTC m=+0.440133940 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:25:42 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:25:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:25:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:25:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:25:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:25:43 localhost podman[80713]: 2025-11-26 08:25:43.826091168 +0000 UTC m=+0.086175075 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 26 03:25:43 localhost podman[80713]: 2025-11-26 08:25:43.863418145 +0000 UTC m=+0.123502002 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, architecture=x86_64, release=1761123044, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git) Nov 26 03:25:43 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:25:43 localhost podman[80714]: 2025-11-26 08:25:43.882258492 +0000 UTC m=+0.139959466 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, tcib_managed=true) Nov 26 03:25:43 localhost podman[80714]: 2025-11-26 08:25:43.917403272 +0000 UTC m=+0.175104246 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:25:43 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:25:43 localhost podman[80715]: 2025-11-26 08:25:43.941472971 +0000 UTC m=+0.196342218 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:25:43 localhost podman[80715]: 2025-11-26 08:25:43.967274452 +0000 UTC m=+0.222143689 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:25:43 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:25:43 localhost podman[80716]: 2025-11-26 08:25:43.992422044 +0000 UTC m=+0.242158784 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, distribution-scope=public, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:25:44 localhost podman[80716]: 2025-11-26 08:25:44.070523032 +0000 UTC m=+0.320259732 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12) Nov 26 03:25:44 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:25:48 localhost python3[80818]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 03:25:51 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 26 03:25:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:25:57 localhost podman[81007]: 2025-11-26 08:25:57.833979294 +0000 UTC m=+0.094897164 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, distribution-scope=public) Nov 26 03:25:58 localhost podman[81007]: 2025-11-26 08:25:58.02737811 +0000 UTC m=+0.288295990 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, url=https://www.redhat.com, version=17.1.12) Nov 26 03:25:58 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:26:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:26:02 localhost systemd[1]: tmp-crun.nsBOfR.mount: Deactivated successfully. 
Nov 26 03:26:02 localhost podman[81037]: 2025-11-26 08:26:02.817573458 +0000 UTC m=+0.081993458 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, release=1761123044, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 26 03:26:02 localhost podman[81037]: 2025-11-26 08:26:02.870039418 +0000 UTC m=+0.134459468 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container) Nov 26 03:26:02 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:26:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:26:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 5231 writes, 23K keys, 5231 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 5231 writes, 596 syncs, 8.78 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 26 03:26:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:26:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:26:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:26:08 localhost systemd[1]: tmp-crun.A6iXNN.mount: Deactivated successfully. 
Nov 26 03:26:08 localhost podman[81066]: 2025-11-26 08:26:08.830536809 +0000 UTC m=+0.080212263 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:26:08 localhost podman[81064]: 2025-11-26 08:26:08.883321219 +0000 UTC m=+0.137604884 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, 
io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vcs-type=git) Nov 26 03:26:08 localhost podman[81065]: 2025-11-26 08:26:08.930800356 +0000 UTC m=+0.183040208 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, 
com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:26:08 localhost podman[81064]: 2025-11-26 08:26:08.960507259 +0000 UTC m=+0.214790974 
container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public) Nov 26 03:26:08 localhost podman[81066]: 2025-11-26 08:26:08.961396756 +0000 UTC m=+0.211072200 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 26 03:26:08 localhost podman[81065]: 2025-11-26 08:26:08.968431251 +0000 UTC m=+0.220671093 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:26:08 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:26:08 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:26:09 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:26:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:26:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 4356 writes, 19K keys, 4356 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4356 writes, 436 syncs, 9.99 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 26 03:26:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:26:12 localhost podman[81197]: 2025-11-26 08:26:12.827107149 +0000 UTC m=+0.084591537 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:26:13 localhost podman[81197]: 2025-11-26 08:26:13.191424191 +0000 UTC m=+0.448908609 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:26:13 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:26:14 localhost podman[81233]: 2025-11-26 08:26:14.828014993 +0000 UTC m=+0.087201257 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.component=openstack-iscsid-container, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:26:14 localhost podman[81234]: 2025-11-26 08:26:14.890989917 +0000 UTC m=+0.149877532 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:26:14 localhost podman[81233]: 2025-11-26 08:26:14.916593983 +0000 UTC m=+0.175780327 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, batch=17.1_20251118.1) Nov 26 03:26:14 localhost podman[81236]: 2025-11-26 08:26:14.932781179 +0000 UTC m=+0.182985258 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, version=17.1.12, container_name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:26:14 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:26:14 localhost podman[81234]: 2025-11-26 08:26:14.954418513 +0000 UTC m=+0.213306118 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:26:14 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:26:14 localhost podman[81236]: 2025-11-26 08:26:14.978366219 +0000 UTC m=+0.228570328 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:26:14 localhost 
systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:26:15 localhost podman[81235]: 2025-11-26 08:26:15.039527126 +0000 UTC m=+0.293273263 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, 
version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z) Nov 26 03:26:15 localhost podman[81235]: 2025-11-26 08:26:15.067481224 +0000 UTC m=+0.321227441 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 
6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:26:15 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:26:15 localhost systemd[1]: tmp-crun.sJ5h27.mount: Deactivated successfully. Nov 26 03:26:22 localhost sshd[81318]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:26:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:26:28 localhost systemd[1]: tmp-crun.0YGBEZ.mount: Deactivated successfully. 
Nov 26 03:26:28 localhost podman[81365]: 2025-11-26 08:26:28.845074391 +0000 UTC m=+0.102323642 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:26:29 localhost podman[81365]: 2025-11-26 08:26:29.111000564 +0000 UTC m=+0.368249815 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, tcib_managed=true, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64) Nov 26 03:26:29 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:26:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:26:33 localhost podman[81395]: 2025-11-26 08:26:33.82388951 +0000 UTC m=+0.083094572 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, version=17.1.12, config_id=tripleo_step5, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z) Nov 26 03:26:33 localhost podman[81395]: 2025-11-26 08:26:33.861498374 +0000 UTC m=+0.120703416 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:26:33 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:26:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:26:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:26:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:26:39 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:26:39 localhost recover_tripleo_nova_virtqemud[81439]: 61604 Nov 26 03:26:39 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:26:39 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 03:26:39 localhost podman[81422]: 2025-11-26 08:26:39.835126748 +0000 UTC m=+0.094252894 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
build-date=2025-11-19T00:11:48Z, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container) Nov 26 03:26:39 localhost podman[81423]: 2025-11-26 08:26:39.889583069 +0000 UTC m=+0.146407194 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:26:39 localhost podman[81422]: 2025-11-26 08:26:39.897707829 +0000 UTC m=+0.156834025 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, 
maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 
26 03:26:39 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:26:39 localhost podman[81423]: 2025-11-26 08:26:39.924853072 +0000 UTC m=+0.181677197 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible) Nov 26 03:26:39 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:26:39 localhost podman[81424]: 2025-11-26 08:26:39.994181928 +0000 UTC m=+0.243467122 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z) Nov 26 03:26:40 localhost podman[81424]: 2025-11-26 08:26:40.023365864 +0000 UTC m=+0.272651058 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:26:40 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:26:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:26:43 localhost podman[81498]: 2025-11-26 08:26:43.818227619 +0000 UTC m=+0.079814838 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, version=17.1.12) Nov 26 03:26:44 localhost podman[81498]: 2025-11-26 08:26:44.192319389 +0000 UTC m=+0.453906588 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, container_name=nova_migration_target) Nov 26 03:26:44 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. 
Nov 26 03:26:44 localhost python3[81537]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 03:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:26:45 localhost systemd[1]: tmp-crun.0FAblI.mount: Deactivated successfully. 
Nov 26 03:26:45 localhost podman[81541]: 2025-11-26 08:26:45.838151153 +0000 UTC m=+0.097789769 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, build-date=2025-11-18T22:51:28Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=) Nov 26 03:26:45 localhost podman[81541]: 2025-11-26 08:26:45.847579022 +0000 UTC m=+0.107217638 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:26:45 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:26:45 localhost podman[81543]: 2025-11-26 08:26:45.936018204 +0000 UTC m=+0.190634287 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:26:45 localhost podman[81543]: 2025-11-26 08:26:45.984760638 +0000 UTC m=+0.239376751 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:26:45 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:26:46 localhost podman[81542]: 2025-11-26 08:26:46.032675087 +0000 UTC m=+0.288588850 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:26:46 localhost podman[81540]: 2025-11-26 08:26:45.99103904 +0000 UTC m=+0.252671168 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:26:46 localhost podman[81542]: 2025-11-26 08:26:46.059382076 +0000 UTC m=+0.315295809 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:26:46 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:26:46 localhost podman[81540]: 2025-11-26 08:26:46.077759109 +0000 UTC m=+0.339391187 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:44:13Z, tcib_managed=true, release=1761123044, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 26 03:26:46 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:26:48 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 26 03:26:48 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 26 03:26:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:26:59 localhost podman[81812]: 2025-11-26 08:26:59.829136166 +0000 UTC m=+0.090494775 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:27:00 localhost podman[81812]: 2025-11-26 08:27:00.063415159 +0000 UTC m=+0.324773738 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Nov 26 03:27:00 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:27:00 localhost sshd[81841]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:27:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:27:04 localhost systemd[1]: tmp-crun.GxDlte.mount: Deactivated successfully. 
Nov 26 03:27:04 localhost podman[81843]: 2025-11-26 08:27:04.84597515 +0000 UTC m=+0.107793116 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Nov 26 03:27:04 localhost podman[81843]: 2025-11-26 08:27:04.901846083 +0000 UTC m=+0.163664019 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 
'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
tcib_managed=true, container_name=nova_compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:27:04 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:27:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:27:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:27:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:27:10 localhost podman[81869]: 2025-11-26 08:27:10.828268366 +0000 UTC m=+0.092021113 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:27:10 localhost systemd[1]: tmp-crun.M5RAlA.mount: Deactivated successfully. Nov 26 03:27:10 localhost podman[81870]: 2025-11-26 08:27:10.886832301 +0000 UTC m=+0.148810893 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond) Nov 26 03:27:10 localhost podman[81869]: 2025-11-26 08:27:10.939849647 +0000 UTC m=+0.203602364 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:27:10 localhost podman[81870]: 2025-11-26 08:27:10.947893334 +0000 UTC m=+0.209871896 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, name=rhosp17/openstack-cron) Nov 26 03:27:10 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:27:10 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:27:10 localhost podman[81871]: 2025-11-26 08:27:10.942682234 +0000 UTC m=+0.201419147 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044) Nov 26 03:27:11 localhost podman[81871]: 2025-11-26 08:27:11.026296768 +0000 UTC m=+0.285033621 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:27:11 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:27:11 localhost systemd[1]: tmp-crun.6jVxLG.mount: Deactivated successfully. Nov 26 03:27:13 localhost python3[81983]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Nov 26 03:27:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:27:14 localhost podman[82069]: 2025-11-26 08:27:14.831144141 +0000 UTC m=+0.090701262 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:27:15 localhost podman[82069]: 2025-11-26 08:27:15.231124805 +0000 UTC m=+0.490681946 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, container_name=nova_migration_target, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4) Nov 26 03:27:15 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:27:16 localhost podman[82109]: 2025-11-26 08:27:16.84016318 +0000 UTC m=+0.095755127 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, version=17.1.12, vendor=Red Hat, Inc., container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container) Nov 26 03:27:16 localhost podman[82109]: 2025-11-26 08:27:16.864900618 +0000 UTC m=+0.120492565 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true) Nov 26 03:27:16 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:27:16 localhost podman[82107]: 2025-11-26 08:27:16.933795471 +0000 UTC m=+0.192491903 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid) Nov 26 03:27:16 localhost podman[82110]: 2025-11-26 08:27:16.937271368 +0000 UTC m=+0.187083958 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:27:16 localhost podman[82110]: 2025-11-26 08:27:16.983806884 +0000 UTC m=+0.233619464 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:14:25Z, tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=) Nov 26 03:27:16 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:27:16 localhost podman[82108]: 2025-11-26 08:27:16.998403142 +0000 UTC m=+0.255416802 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.openshift.expose-services=) Nov 26 03:27:17 localhost podman[82108]: 2025-11-26 08:27:17.012216466 +0000 UTC m=+0.269230086 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Nov 26 03:27:17 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:27:17 localhost podman[82107]: 2025-11-26 08:27:17.064988393 +0000 UTC m=+0.323684845 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, release=1761123044) Nov 26 03:27:17 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:27:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:27:30 localhost systemd[1]: tmp-crun.ljh2ui.mount: Deactivated successfully. 
Nov 26 03:27:30 localhost podman[82239]: 2025-11-26 08:27:30.831619548 +0000 UTC m=+0.089004369 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044) Nov 26 03:27:31 localhost podman[82239]: 2025-11-26 08:27:31.043513985 +0000 UTC m=+0.300898766 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team) Nov 26 03:27:31 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:27:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:27:35 localhost podman[82267]: 2025-11-26 08:27:35.80602323 +0000 UTC m=+0.072373239 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step5) Nov 26 03:27:35 localhost podman[82267]: 2025-11-26 08:27:35.833475722 +0000 UTC m=+0.099825681 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat 
OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5) Nov 26 03:27:35 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:27:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:27:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:27:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:27:41 localhost systemd[1]: tmp-crun.UDCoih.mount: Deactivated successfully. 
Nov 26 03:27:41 localhost podman[82295]: 2025-11-26 08:27:41.816340826 +0000 UTC m=+0.078056275 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z) Nov 26 03:27:41 localhost podman[82295]: 2025-11-26 08:27:41.827349203 +0000 UTC m=+0.089064652 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true) Nov 26 03:27:41 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:27:41 localhost systemd[1]: tmp-crun.AphB87.mount: Deactivated successfully. 
Nov 26 03:27:41 localhost podman[82294]: 2025-11-26 08:27:41.881032069 +0000 UTC m=+0.140361364 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:27:41 localhost podman[82296]: 2025-11-26 08:27:41.920751267 +0000 UTC m=+0.173284434 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=) Nov 26 03:27:41 localhost podman[82296]: 2025-11-26 08:27:41.94725441 +0000 UTC m=+0.199787607 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.12, release=1761123044, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi) Nov 26 03:27:41 localhost podman[82294]: 2025-11-26 08:27:41.95377218 +0000 UTC m=+0.213101455 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
container_name=ceilometer_agent_compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Nov 26 03:27:41 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:27:41 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:27:42 localhost sshd[82364]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:27:45 localhost systemd[1]: tmp-crun.nF9uXZ.mount: Deactivated successfully. 
Nov 26 03:27:45 localhost podman[82366]: 2025-11-26 08:27:45.843971649 +0000 UTC m=+0.094422336 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true) Nov 26 03:27:46 localhost podman[82366]: 2025-11-26 08:27:46.280694029 +0000 UTC m=+0.531144716 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, architecture=x86_64) Nov 26 03:27:46 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:27:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:27:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:27:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:27:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:27:47 localhost podman[82390]: 2025-11-26 08:27:47.823335408 +0000 UTC m=+0.083288145 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 26 03:27:47 localhost podman[82393]: 2025-11-26 08:27:47.886515765 +0000 UTC m=+0.137209298 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4) Nov 26 03:27:47 localhost podman[82391]: 2025-11-26 08:27:47.935786805 +0000 UTC m=+0.192996758 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, vcs-type=git, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public) Nov 26 03:27:47 localhost podman[82391]: 2025-11-26 08:27:47.944612576 +0000 UTC m=+0.201822559 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z) Nov 26 03:27:47 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:27:47 localhost podman[82392]: 2025-11-26 08:27:47.981251699 +0000 UTC m=+0.234837300 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:27:48 localhost podman[82393]: 2025-11-26 08:27:48.007404062 +0000 UTC m=+0.258097555 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64) Nov 26 03:27:48 localhost podman[82390]: 2025-11-26 08:27:48.011316841 +0000 UTC m=+0.271269568 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red 
Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, batch=17.1_20251118.1) Nov 26 03:27:48 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:27:48 localhost podman[82392]: 2025-11-26 08:27:48.030241262 +0000 UTC m=+0.283826793 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
version=17.1.12, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:27:48 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:27:48 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:28:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:28:01 localhost podman[82475]: 2025-11-26 08:28:01.820561973 +0000 UTC m=+0.089273358 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=metrics_qdr, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 26 03:28:02 localhost podman[82475]: 2025-11-26 08:28:02.042424286 +0000 UTC m=+0.311135671 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, release=1761123044) Nov 26 03:28:02 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:28:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:28:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:28:06 localhost recover_tripleo_nova_virtqemud[82507]: 61604 Nov 26 03:28:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:28:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:28:06 localhost podman[82505]: 2025-11-26 08:28:06.812167843 +0000 UTC m=+0.077165696 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, vcs-type=git) Nov 26 03:28:06 localhost podman[82505]: 2025-11-26 08:28:06.867145839 +0000 UTC m=+0.132143662 container exec_died 
f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, com.redhat.component=openstack-nova-compute-container) Nov 26 03:28:06 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:28:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:28:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:28:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:28:12 localhost podman[82534]: 2025-11-26 08:28:12.805913011 +0000 UTC m=+0.070441271 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO 
Team, architecture=x86_64, name=rhosp17/openstack-cron, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, release=1761123044) Nov 26 03:28:12 localhost podman[82534]: 2025-11-26 08:28:12.82252962 +0000 UTC m=+0.087057960 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-cron, vcs-type=git, batch=17.1_20251118.1) Nov 26 03:28:12 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:28:12 localhost podman[82535]: 2025-11-26 08:28:12.911725685 +0000 UTC m=+0.176313297 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:12:45Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:28:12 localhost podman[82533]: 2025-11-26 08:28:12.894211699 +0000 UTC m=+0.159719489 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:28:12 localhost podman[82535]: 2025-11-26 08:28:12.962352568 +0000 UTC m=+0.226940180 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, 
config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 
03:28:12 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:28:12 localhost podman[82533]: 2025-11-26 08:28:12.980402521 +0000 UTC m=+0.245910331 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute) Nov 26 03:28:12 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:28:13 localhost systemd[1]: session-35.scope: Deactivated successfully. Nov 26 03:28:13 localhost systemd[1]: session-35.scope: Consumed 20.114s CPU time. Nov 26 03:28:13 localhost systemd-logind[761]: Session 35 logged out. Waiting for processes to exit. Nov 26 03:28:13 localhost systemd-logind[761]: Removed session 35. Nov 26 03:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:28:16 localhost podman[82671]: 2025-11-26 08:28:16.823731253 +0000 UTC m=+0.087068050 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 26 03:28:17 localhost podman[82671]: 2025-11-26 08:28:17.190380606 +0000 UTC m=+0.453717363 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, architecture=x86_64) Nov 26 03:28:17 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:28:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:28:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:28:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:28:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:28:18 localhost podman[82710]: 2025-11-26 08:28:18.827697038 +0000 UTC m=+0.089618557 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-iscsid) Nov 26 03:28:18 localhost podman[82710]: 2025-11-26 08:28:18.837412536 +0000 UTC m=+0.099334085 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Nov 26 03:28:18 localhost systemd[1]: tmp-crun.mv4EeL.mount: Deactivated successfully. 
Nov 26 03:28:18 localhost podman[82712]: 2025-11-26 08:28:18.870528881 +0000 UTC m=+0.127618243 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:28:18 localhost systemd[1]: tmp-crun.1NntiO.mount: Deactivated successfully. Nov 26 03:28:18 localhost podman[82711]: 2025-11-26 08:28:18.923068003 +0000 UTC m=+0.182628461 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, version=17.1.12, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:28:18 localhost podman[82711]: 2025-11-26 08:28:18.934289907 +0000 UTC m=+0.193850415 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.buildah.version=1.41.4, container_name=collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 26 03:28:18 localhost systemd[1]: 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:28:18 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:28:18 localhost podman[82712]: 2025-11-26 08:28:18.975825859 +0000 UTC m=+0.232915271 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) Nov 26 03:28:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:28:19 localhost podman[82713]: 2025-11-26 08:28:19.030228278 +0000 UTC m=+0.283943556 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:28:19 localhost podman[82713]: 2025-11-26 08:28:19.104383231 +0000 UTC m=+0.358098479 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, 
tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 26 03:28:19 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:28:24 localhost sshd[82794]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:28:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:28:32 localhost podman[82841]: 2025-11-26 08:28:32.826479401 +0000 UTC m=+0.087854975 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git) Nov 26 03:28:33 localhost podman[82841]: 2025-11-26 08:28:33.079577161 +0000 UTC m=+0.340952735 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:28:33 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:28:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:28:37 localhost podman[82871]: 2025-11-26 08:28:37.7873928 +0000 UTC m=+0.056810583 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1761123044) Nov 26 03:28:37 localhost podman[82871]: 2025-11-26 08:28:37.837378122 +0000 UTC m=+0.106795935 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, release=1761123044, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:28:37 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:28:43 localhost systemd[1]: tmp-crun.HfCAO2.mount: Deactivated successfully. 
Nov 26 03:28:43 localhost podman[82898]: 2025-11-26 08:28:43.830092028 +0000 UTC m=+0.095263652 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:28:43 localhost podman[82899]: 2025-11-26 08:28:43.844064416 +0000 UTC m=+0.101894936 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, version=17.1.12) Nov 26 03:28:43 localhost podman[82899]: 2025-11-26 08:28:43.851612747 +0000 UTC m=+0.109443267 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 26 03:28:43 localhost podman[82898]: 2025-11-26 08:28:43.89085797 +0000 UTC m=+0.156029594 container exec_died 
3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public) Nov 26 03:28:43 localhost podman[82904]: 2025-11-26 08:28:43.89799858 +0000 UTC m=+0.148698921 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:28:43 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:28:43 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:28:43 localhost podman[82904]: 2025-11-26 08:28:43.957155643 +0000 UTC m=+0.207855894 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:28:43 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:28:44 localhost systemd[1]: tmp-crun.ONKF9L.mount: Deactivated successfully. Nov 26 03:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:28:47 localhost podman[82971]: 2025-11-26 08:28:47.809190672 +0000 UTC m=+0.074444904 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, container_name=nova_migration_target, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:28:48 localhost podman[82971]: 2025-11-26 08:28:48.180138276 +0000 UTC m=+0.445392528 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com) Nov 26 03:28:48 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:28:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:28:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:28:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:28:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:28:49 localhost systemd[1]: tmp-crun.ADRpnG.mount: Deactivated successfully. Nov 26 03:28:49 localhost systemd[1]: tmp-crun.E3F4Em.mount: Deactivated successfully. 
Nov 26 03:28:49 localhost podman[82996]: 2025-11-26 08:28:49.833022686 +0000 UTC m=+0.081605923 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, version=17.1.12, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO 
Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:28:49 localhost podman[82994]: 2025-11-26 08:28:49.889464927 +0000 UTC m=+0.144835682 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.component=openstack-iscsid-container, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, 
architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-type=git) Nov 26 03:28:49 localhost podman[82994]: 2025-11-26 08:28:49.903469067 +0000 UTC m=+0.158839832 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, architecture=x86_64, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:28:49 localhost podman[83002]: 2025-11-26 08:28:49.865141461 +0000 UTC m=+0.107974952 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, batch=17.1_20251118.1) Nov 26 03:28:49 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:28:49 localhost podman[83002]: 2025-11-26 08:28:49.949530449 +0000 UTC m=+0.192363940 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, release=1761123044, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1) Nov 26 03:28:49 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:28:49 localhost podman[82996]: 2025-11-26 08:28:49.968470209 +0000 UTC m=+0.217053466 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:34:05Z) Nov 26 03:28:49 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:28:50 localhost podman[82995]: 2025-11-26 08:28:50.034989108 +0000 UTC m=+0.283956017 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step3, release=1761123044, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:28:50 localhost podman[82995]: 2025-11-26 08:28:50.043157459 +0000 UTC m=+0.292124428 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:28:50 localhost systemd[1]: 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:28:50 localhost systemd[1]: tmp-crun.mh0Jw0.mount: Deactivated successfully. Nov 26 03:29:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:29:03 localhost podman[83077]: 2025-11-26 08:29:03.827235911 +0000 UTC m=+0.094150359 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Nov 26 03:29:04 localhost podman[83077]: 2025-11-26 08:29:04.029566494 +0000 UTC m=+0.296480942 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:29:04 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:29:06 localhost sshd[83106]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:29:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:29:08 localhost podman[83108]: 2025-11-26 08:29:08.806305586 +0000 UTC m=+0.071850394 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, container_name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, version=17.1.12) Nov 26 03:29:08 localhost podman[83108]: 2025-11-26 08:29:08.864366856 +0000 UTC m=+0.129911664 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., 
build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4) Nov 26 03:29:08 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:29:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:29:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:29:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:29:14 localhost podman[83260]: 2025-11-26 08:29:14.823361339 +0000 UTC m=+0.082478160 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Nov 26 03:29:14 localhost podman[83260]: 2025-11-26 08:29:14.871480874 +0000 UTC m=+0.130597615 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, release=1761123044, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4) Nov 26 03:29:14 localhost systemd[1]: tmp-crun.dumrwv.mount: Deactivated successfully. Nov 26 03:29:14 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:29:14 localhost podman[83262]: 2025-11-26 08:29:14.930323578 +0000 UTC m=+0.183079674 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12) Nov 26 03:29:14 localhost podman[83261]: 2025-11-26 08:29:14.881848532 +0000 UTC m=+0.138836558 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:29:14 localhost podman[83262]: 2025-11-26 08:29:14.952291632 +0000 UTC m=+0.205047698 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:29:14 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:29:14 localhost podman[83261]: 2025-11-26 08:29:14.968175399 +0000 UTC m=+0.225163345 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4) Nov 26 03:29:14 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:29:18 localhost systemd[1]: tmp-crun.L8X8SB.mount: Deactivated successfully. Nov 26 03:29:18 localhost podman[83683]: 2025-11-26 08:29:18.058476192 +0000 UTC m=+0.101152162 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., name=rhceph, 
version=7, RELEASE=main, release=553, architecture=x86_64) Nov 26 03:29:18 localhost systemd-logind[761]: Existing logind session ID 29 used by new audit session, ignoring. Nov 26 03:29:18 localhost systemd[1]: Created slice User Slice of UID 0. Nov 26 03:29:18 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 26 03:29:18 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 26 03:29:18 localhost systemd[1]: Starting User Manager for UID 0... Nov 26 03:29:18 localhost podman[83683]: 2025-11-26 08:29:18.159528471 +0000 UTC m=+0.202204421 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 03:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:29:18 localhost systemd[83704]: Queued start job for default target Main User Target. Nov 26 03:29:18 localhost systemd[83704]: Created slice User Application Slice. Nov 26 03:29:18 localhost systemd[83704]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 26 03:29:18 localhost systemd[83704]: Started Daily Cleanup of User's Temporary Directories. Nov 26 03:29:18 localhost systemd[83704]: Reached target Paths. Nov 26 03:29:18 localhost systemd[83704]: Reached target Timers. Nov 26 03:29:18 localhost systemd[83704]: Starting D-Bus User Message Bus Socket... Nov 26 03:29:18 localhost systemd[83704]: Starting Create User's Volatile Files and Directories... Nov 26 03:29:18 localhost systemd[83704]: Finished Create User's Volatile Files and Directories. Nov 26 03:29:18 localhost systemd[83704]: Listening on D-Bus User Message Bus Socket. Nov 26 03:29:18 localhost systemd[83704]: Reached target Sockets. Nov 26 03:29:18 localhost systemd[83704]: Reached target Basic System. Nov 26 03:29:18 localhost systemd[83704]: Reached target Main User Target. Nov 26 03:29:18 localhost systemd[83704]: Startup finished in 157ms. Nov 26 03:29:18 localhost systemd[1]: Started User Manager for UID 0. Nov 26 03:29:18 localhost systemd[1]: Started Session c11 of User root. 
Nov 26 03:29:18 localhost podman[83732]: 2025-11-26 08:29:18.396094254 +0000 UTC m=+0.157180101 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 26 03:29:18 localhost podman[83732]: 2025-11-26 08:29:18.716317403 +0000 UTC m=+0.477403230 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:29:18 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:29:19 localhost systemd[1]: tmp-crun.Bh17Jg.mount: Deactivated successfully. Nov 26 03:29:19 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Nov 26 03:29:19 localhost kernel: device tap5afdc9d0-95 entered promiscuous mode Nov 26 03:29:19 localhost NetworkManager[5970]: [1764145759.5014] manager: (tap5afdc9d0-95): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Nov 26 03:29:19 localhost systemd-udevd[83865]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 03:29:19 localhost NetworkManager[5970]: [1764145759.5220] device (tap5afdc9d0-95): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 26 03:29:19 localhost NetworkManager[5970]: [1764145759.5224] device (tap5afdc9d0-95): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 26 03:29:19 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Nov 26 03:29:19 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... Nov 26 03:29:19 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Nov 26 03:29:19 localhost systemd-machined[83873]: New machine qemu-1-instance-00000002. Nov 26 03:29:19 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000002. Nov 26 03:29:19 localhost NetworkManager[5970]: [1764145759.7729] manager: (tap3633976c-30): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Nov 26 03:29:19 localhost systemd-udevd[83863]: Network interface NamePolicy= disabled on kernel command line. Nov 26 03:29:19 localhost NetworkManager[5970]: [1764145759.8127] device (tap3633976c-30): carrier: link connected Nov 26 03:29:19 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap3633976c-31: link becomes ready Nov 26 03:29:19 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap3633976c-30: link becomes ready Nov 26 03:29:19 localhost kernel: device tap3633976c-30 entered promiscuous mode Nov 26 03:29:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. 
Nov 26 03:29:20 localhost podman[83969]: 2025-11-26 08:29:20.021521193 +0000 UTC m=+0.067150890 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible) Nov 26 03:29:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:29:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:29:20 localhost podman[83969]: 2025-11-26 08:29:20.041745403 +0000 UTC m=+0.087375130 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, container_name=iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:29:20 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:29:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:29:20 localhost systemd[1]: tmp-crun.FLFCC1.mount: Deactivated successfully. 
Nov 26 03:29:20 localhost systemd[1]: tmp-crun.hkU9vq.mount: Deactivated successfully. Nov 26 03:29:20 localhost podman[83990]: 2025-11-26 08:29:20.104974231 +0000 UTC m=+0.065547961 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:34:05Z, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:29:20 localhost podman[84012]: 2025-11-26 08:29:20.165584609 +0000 UTC m=+0.080391716 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:29:20 localhost podman[83990]: 2025-11-26 08:29:20.188217263 +0000 UTC m=+0.148790973 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:29:20 localhost podman[84012]: 2025-11-26 08:29:20.197960322 +0000 UTC m=+0.112767409 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-collectd-container, architecture=x86_64, container_name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3) Nov 26 03:29:20 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:29:20 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:29:20 localhost podman[83991]: 2025-11-26 08:29:20.137748996 +0000 UTC m=+0.086393140 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Nov 26 03:29:20 localhost podman[83991]: 2025-11-26 08:29:20.271243459 +0000 UTC m=+0.219887613 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, 
architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:29:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:29:21 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Nov 26 03:29:21 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Nov 26 03:29:21 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. Nov 26 03:29:21 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. 
Nov 26 03:29:22 localhost podman[84110]: 2025-11-26 08:29:22.132785667 +0000 UTC m=+0.093963492 container create 80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:14:25Z, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:29:22 localhost podman[84110]: 2025-11-26 08:29:22.08589391 +0000 UTC m=+0.047071785 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 26 03:29:22 localhost systemd[1]: Started libpod-conmon-80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e.scope. Nov 26 03:29:22 localhost systemd[1]: tmp-crun.8rgxCl.mount: Deactivated successfully. 
Nov 26 03:29:22 localhost systemd[1]: Started libcrun container. Nov 26 03:29:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7559b400252157619e8013c4f621a43e0a48c792e3e4ada3b21ecf95d0bea65/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 03:29:22 localhost podman[84110]: 2025-11-26 08:29:22.260437671 +0000 UTC m=+0.221615506 container init 80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.expose-services=) Nov 26 03:29:22 localhost podman[84110]: 2025-11-26 08:29:22.269441627 +0000 UTC m=+0.230619452 container start 
80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.) Nov 26 03:29:22 localhost setroubleshoot[84073]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. 
For complete SELinux messages run: sealert -l 41e13fe7-4246-4c3e-9d2a-3104b27ca041 Nov 26 03:29:22 localhost setroubleshoot[84073]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.#012#012***** Plugin qemu_file_image (98.8 confidence) suggests *******************#012#012If max_map_count is a virtualization target#012Then you need to change the label on max_map_count'#012Do#012# semanage fcontext -a -t virt_image_t 'max_map_count'#012# restorecon -v 'max_map_count'#012#012***** Plugin catchall (2.13 confidence) suggests **************************#012#012If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm#012# semodule -X 300 -i my-qemukvm.pp#012 Nov 26 03:29:32 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully. Nov 26 03:29:32 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Nov 26 03:29:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:29:34 localhost snmpd[66980]: empty variable list in _query Nov 26 03:29:34 localhost snmpd[66980]: empty variable list in _query Nov 26 03:29:34 localhost podman[84185]: 2025-11-26 08:29:34.8337759 +0000 UTC m=+0.092703123 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:29:35 localhost podman[84185]: 2025-11-26 08:29:35.079400592 +0000 UTC m=+0.338327775 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 26 03:29:35 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:29:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:29:39 localhost podman[84215]: 2025-11-26 08:29:39.819572001 +0000 UTC m=+0.081015695 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:29:39 localhost podman[84215]: 2025-11-26 08:29:39.85150627 +0000 UTC m=+0.112949944 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, architecture=x86_64, container_name=nova_compute, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step5, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:29:39 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:29:40 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44944 [26/Nov/2025:08:29:39.122] listener listener/metadata 0/0/0/1416/1416 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 26 03:29:40 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44950 [26/Nov/2025:08:29:40.643] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Nov 26 03:29:40 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44952 [26/Nov/2025:08:29:40.734] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 26 03:29:40 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44962 [26/Nov/2025:08:29:40.812] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Nov 26 03:29:40 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44964 [26/Nov/2025:08:29:40.875] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Nov 26 
03:29:40 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44976 [26/Nov/2025:08:29:40.936] listener listener/metadata 0/0/0/16/16 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44988 [26/Nov/2025:08:29:40.997] listener listener/metadata 0/0/0/16/16 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44994 [26/Nov/2025:08:29:41.105] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:44996 [26/Nov/2025:08:29:41.173] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:45008 [26/Nov/2025:08:29:41.227] listener listener/metadata 0/0/0/10/10 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:45022 [26/Nov/2025:08:29:41.274] listener listener/metadata 0/0/0/15/15 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:45034 [26/Nov/2025:08:29:41.320] listener listener/metadata 0/0/0/11/11 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:45040 [26/Nov/2025:08:29:41.359] listener listener/metadata 
0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:45044 [26/Nov/2025:08:29:41.399] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:45050 [26/Nov/2025:08:29:41.450] listener listener/metadata 0/0/0/10/10 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Nov 26 03:29:41 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[84134]: 192.168.0.160:45064 [26/Nov/2025:08:29:41.498] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Nov 26 03:29:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:29:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:29:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:29:45 localhost podman[84242]: 2025-11-26 08:29:45.823967545 +0000 UTC m=+0.087720200 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 26 03:29:45 localhost podman[84242]: 2025-11-26 08:29:45.832253029 +0000 UTC m=+0.096005674 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, name=rhosp17/openstack-cron, vcs-type=git, container_name=logrotate_crond, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:29:45 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:29:45 localhost systemd[1]: tmp-crun.oErErr.mount: Deactivated successfully. 
Nov 26 03:29:45 localhost podman[84243]: 2025-11-26 08:29:45.91936618 +0000 UTC m=+0.181249348 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044) Nov 26 03:29:45 localhost podman[84241]: 2025-11-26 08:29:45.893420755 +0000 UTC m=+0.158730528 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 03:29:45 localhost podman[84243]: 2025-11-26 08:29:45.968366933 +0000 UTC m=+0.230250121 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
maintainer=OpenStack TripleO Team) Nov 26 03:29:45 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:29:46 localhost podman[84241]: 2025-11-26 08:29:46.02240953 +0000 UTC m=+0.287719293 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, architecture=x86_64, release=1761123044, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 03:29:46 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:29:46 localhost sshd[84310]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:29:47 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Nov 26 03:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:29:49 localhost podman[84312]: 2025-11-26 08:29:49.830532932 +0000 UTC m=+0.084948216 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container) Nov 26 03:29:50 localhost podman[84312]: 2025-11-26 08:29:50.222013975 +0000 UTC m=+0.476429199 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:29:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:29:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:29:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:29:50 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. 
Nov 26 03:29:50 localhost podman[84337]: 2025-11-26 08:29:50.336150215 +0000 UTC m=+0.072602627 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:29:50 localhost podman[84337]: 2025-11-26 08:29:50.352474476 +0000 UTC m=+0.088926838 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, tcib_managed=true, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:29:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:29:50 localhost podman[84336]: 2025-11-26 08:29:50.374670066 +0000 UTC m=+0.120351581 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, container_name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Nov 26 03:29:50 localhost podman[84375]: 2025-11-26 08:29:50.436334407 +0000 UTC m=+0.073320850 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Nov 26 03:29:50 localhost podman[84375]: 2025-11-26 08:29:50.480574283 +0000 UTC m=+0.117560786 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, release=1761123044, vcs-type=git, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:29:50 localhost podman[84338]: 2025-11-26 08:29:50.491336183 +0000 UTC m=+0.229031343 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, batch=17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z) Nov 26 03:29:50 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:29:50 localhost podman[84336]: 2025-11-26 08:29:50.513665148 +0000 UTC m=+0.259346723 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044) Nov 26 03:29:50 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:29:50 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:29:50 localhost podman[84338]: 2025-11-26 08:29:50.547310639 +0000 UTC m=+0.285005789 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:29:50 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:29:50 localhost systemd[1]: tmp-crun.P1L5fa.mount: Deactivated successfully. Nov 26 03:29:57 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:29:57 localhost recover_tripleo_nova_virtqemud[84426]: 61604 Nov 26 03:29:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:29:57 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:30:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:30:05 localhost systemd[1]: tmp-crun.AJkfJJ.mount: Deactivated successfully. 
Nov 26 03:30:05 localhost podman[84427]: 2025-11-26 08:30:05.834250669 +0000 UTC m=+0.101365519 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, name=rhosp17/openstack-qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Nov 26 03:30:06 localhost podman[84427]: 2025-11-26 08:30:06.060388143 +0000 UTC m=+0.327502943 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, config_id=tripleo_step1) Nov 26 03:30:06 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:30:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:30:10 localhost podman[84457]: 2025-11-26 08:30:10.819103661 +0000 UTC m=+0.081528741 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 26 03:30:10 localhost podman[84457]: 2025-11-26 08:30:10.849553084 +0000 UTC m=+0.111978154 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 
17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:30:10 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:30:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:30:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:30:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:30:16 localhost podman[84483]: 2025-11-26 08:30:16.819514883 +0000 UTC m=+0.086724660 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4) Nov 26 03:30:16 localhost systemd[1]: tmp-crun.VREyae.mount: Deactivated successfully. Nov 26 03:30:16 localhost podman[84484]: 2025-11-26 08:30:16.850340668 +0000 UTC m=+0.111706826 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, release=1761123044, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, container_name=logrotate_crond, version=17.1.12) Nov 26 03:30:16 localhost podman[84484]: 2025-11-26 08:30:16.890422437 +0000 UTC m=+0.151788605 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=logrotate_crond, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:30:16 localhost systemd[1]: 
7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:30:16 localhost podman[84483]: 2025-11-26 08:30:16.917707914 +0000 UTC m=+0.184917711 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, vcs-type=git, 
build-date=2025-11-19T00:11:48Z, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 26 03:30:16 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:30:16 localhost podman[84485]: 2025-11-26 08:30:16.893831601 +0000 UTC m=+0.150964530 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) 
Nov 26 03:30:16 localhost podman[84485]: 2025-11-26 08:30:16.974310199 +0000 UTC m=+0.231443078 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044) Nov 26 03:30:16 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:30:17 localhost systemd[1]: tmp-crun.8NI9fK.mount: Deactivated successfully. Nov 26 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:30:20 localhost systemd[1]: tmp-crun.CFq7cM.mount: Deactivated successfully. 
Nov 26 03:30:20 localhost podman[84607]: 2025-11-26 08:30:20.836345255 +0000 UTC m=+0.084670287 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ovn_controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, 
maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:30:20 localhost podman[84607]: 2025-11-26 08:30:20.859222548 +0000 UTC m=+0.107547560 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, container_name=ovn_controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:30:20 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:30:20 localhost systemd[1]: tmp-crun.x5EGQb.mount: Deactivated successfully. Nov 26 03:30:20 localhost podman[84608]: 2025-11-26 08:30:20.901068191 +0000 UTC m=+0.141645275 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:30:20 localhost podman[84604]: 2025-11-26 08:30:20.91052917 +0000 UTC m=+0.164227887 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, container_name=iscsid, 
batch=17.1_20251118.1, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat 
OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:30:20 localhost podman[84608]: 2025-11-26 08:30:20.936221418 +0000 UTC m=+0.176798452 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team) Nov 26 03:30:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:30:20 localhost podman[84604]: 2025-11-26 08:30:20.950413833 +0000 UTC m=+0.204112520 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, container_name=iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64) Nov 26 03:30:20 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:30:20 localhost podman[84619]: 2025-11-26 08:30:20.940953933 +0000 UTC m=+0.182348262 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, container_name=nova_migration_target, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64) Nov 26 03:30:20 localhost podman[84606]: 2025-11-26 08:30:20.994306759 +0000 UTC m=+0.246433417 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, batch=17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:30:21 localhost podman[84606]: 2025-11-26 08:30:21.004257414 +0000 UTC m=+0.256384082 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vcs-type=git, release=1761123044, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd) Nov 26 03:30:21 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:30:21 localhost podman[84619]: 2025-11-26 08:30:21.328245108 +0000 UTC m=+0.569639367 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044) Nov 26 03:30:21 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:30:28 localhost sshd[84741]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:30:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:30:36 localhost podman[84789]: 2025-11-26 08:30:36.838816823 +0000 UTC m=+0.087213895 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 26 03:30:37 localhost podman[84789]: 2025-11-26 08:30:37.064591985 +0000 UTC m=+0.312989097 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, container_name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 
17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:30:37 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:30:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:30:41 localhost podman[84819]: 2025-11-26 08:30:41.828518245 +0000 UTC m=+0.087180773 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_compute, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1761123044, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:30:41 localhost podman[84819]: 2025-11-26 08:30:41.886849674 +0000 UTC m=+0.145512172 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 26 03:30:41 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:30:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:30:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:30:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:30:47 localhost systemd[1]: tmp-crun.ekoN84.mount: Deactivated successfully. 
Nov 26 03:30:47 localhost podman[84845]: 2025-11-26 08:30:47.825297197 +0000 UTC m=+0.088337660 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.) Nov 26 03:30:47 localhost podman[84845]: 2025-11-26 08:30:47.88544869 +0000 UTC m=+0.148489103 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1761123044) Nov 26 03:30:47 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:30:47 localhost podman[84847]: 2025-11-26 08:30:47.935815055 +0000 UTC m=+0.191080970 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Nov 26 03:30:47 localhost podman[84846]: 2025-11-26 08:30:47.887084751 +0000 UTC m=+0.145655107 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:30:48 localhost podman[84846]: 2025-11-26 08:30:48.022532114 +0000 UTC m=+0.281102370 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 26 03:30:48 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:30:48 localhost podman[84847]: 2025-11-26 08:30:48.042527777 +0000 UTC m=+0.297793622 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Nov 26 03:30:48 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:30:51 localhost podman[84921]: 2025-11-26 08:30:51.83957047 +0000 UTC m=+0.083569043 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:30:51 localhost podman[84920]: 2025-11-26 08:30:51.893263837 +0000 UTC m=+0.139069405 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public) Nov 26 03:30:51 localhost podman[84920]: 2025-11-26 08:30:51.940297318 +0000 UTC m=+0.186102846 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 26 03:30:51 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:30:51 localhost podman[84919]: 2025-11-26 08:30:51.94359261 +0000 UTC m=+0.197340052 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:30:51 localhost podman[84917]: 2025-11-26 08:30:51.99351308 +0000 UTC m=+0.247240561 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, vcs-type=git, 
container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:30:52 localhost podman[84918]: 2025-11-26 08:30:52.050795947 +0000 UTC m=+0.304636282 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:51:28Z, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Nov 26 03:30:52 localhost podman[84918]: 2025-11-26 08:30:52.064653652 +0000 UTC m=+0.318493987 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.expose-services=, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:30:52 localhost podman[84917]: 2025-11-26 08:30:52.074414941 +0000 UTC m=+0.328142422 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-18T23:44:13Z, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public) Nov 26 03:30:52 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:30:52 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:30:52 localhost podman[84919]: 2025-11-26 08:30:52.128999295 +0000 UTC m=+0.382746747 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, 
distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:30:52 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:30:52 localhost podman[84921]: 2025-11-26 08:30:52.230071014 +0000 UTC m=+0.474069547 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, architecture=x86_64, batch=17.1_20251118.1, 
com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:30:52 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:31:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:31:07 localhost podman[85028]: 2025-11-26 08:31:07.793429202 +0000 UTC m=+0.059386242 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, 
name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:31:07 localhost podman[85028]: 2025-11-26 08:31:07.95935335 +0000 UTC m=+0.225310420 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:31:07 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:31:09 localhost sshd[85057]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:31:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:31:12 localhost podman[85059]: 2025-11-26 08:31:12.818638893 +0000 UTC m=+0.078978892 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step5, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public) Nov 26 03:31:12 localhost podman[85059]: 2025-11-26 08:31:12.876361734 +0000 UTC m=+0.136701713 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, version=17.1.12, release=1761123044, container_name=nova_compute, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=) Nov 26 03:31:12 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:31:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:31:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:31:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:31:18 localhost systemd[1]: tmp-crun.P9HXSf.mount: Deactivated successfully. 
Nov 26 03:31:18 localhost podman[85085]: 2025-11-26 08:31:18.841917767 +0000 UTC m=+0.104284179 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:31:18 localhost podman[85085]: 2025-11-26 08:31:18.881682427 +0000 UTC m=+0.144048799 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute) Nov 26 03:31:18 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:31:18 localhost podman[85086]: 2025-11-26 08:31:18.928682327 +0000 UTC m=+0.185904981 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:31:18 localhost podman[85087]: 2025-11-26 08:31:18.895458268 +0000 UTC m=+0.146667698 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Nov 26 03:31:18 localhost podman[85086]: 2025-11-26 08:31:18.968354523 +0000 UTC m=+0.225577197 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, 
io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Nov 26 03:31:18 localhost podman[85087]: 2025-11-26 08:31:18.975970757 +0000 UTC m=+0.227180177 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-type=git) Nov 26 03:31:18 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:31:18 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:31:22 localhost podman[85220]: 2025-11-26 08:31:22.822441226 +0000 UTC m=+0.081475529 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 26 03:31:22 localhost podman[85223]: 2025-11-26 08:31:22.854910772 +0000 UTC m=+0.105392373 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 
17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Nov 26 03:31:22 localhost podman[85219]: 2025-11-26 08:31:22.871060707 +0000 UTC m=+0.129392819 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:31:22 localhost podman[85219]: 2025-11-26 08:31:22.904196683 +0000 UTC m=+0.162528765 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible) Nov 26 03:31:22 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:31:22 localhost podman[85221]: 2025-11-26 08:31:22.877044651 +0000 UTC m=+0.131919866 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4) Nov 26 03:31:22 localhost podman[85220]: 2025-11-26 08:31:22.959572841 +0000 UTC m=+0.218607154 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat 
OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, vendor=Red Hat, Inc., tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:31:22 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:31:22 localhost podman[85222]: 2025-11-26 08:31:22.99801494 +0000 UTC m=+0.251780531 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 26 03:31:23 localhost podman[85221]: 2025-11-26 08:31:23.010633376 +0000 UTC m=+0.265508681 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, distribution-scope=public, release=1761123044, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:31:23 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:31:23 localhost podman[85222]: 2025-11-26 08:31:23.063015663 +0000 UTC m=+0.316781294 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent) Nov 26 03:31:23 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:31:23 localhost podman[85223]: 2025-11-26 08:31:23.215357674 +0000 UTC m=+0.465839295 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 26 03:31:23 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:31:23 localhost systemd[1]: tmp-crun.PlRkVZ.mount: Deactivated successfully. Nov 26 03:31:27 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:31:27 localhost recover_tripleo_nova_virtqemud[85339]: 61604 Nov 26 03:31:27 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:31:27 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:31:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:31:38 localhost podman[85386]: 2025-11-26 08:31:38.850978376 +0000 UTC m=+0.109592731 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:31:39 localhost podman[85386]: 2025-11-26 08:31:39.06178485 +0000 UTC m=+0.320399185 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible) Nov 26 03:31:39 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:31:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:31:43 localhost podman[85414]: 2025-11-26 08:31:43.78945639 +0000 UTC m=+0.055747411 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team) Nov 26 03:31:43 localhost podman[85414]: 2025-11-26 08:31:43.839515415 +0000 UTC m=+0.105806506 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z) Nov 26 03:31:43 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:31:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:31:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:31:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:31:49 localhost podman[85442]: 2025-11-26 08:31:49.779520533 +0000 UTC m=+0.048147187 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:31:49 localhost podman[85442]: 2025-11-26 08:31:49.784257089 +0000 UTC m=+0.052883733 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team) Nov 26 03:31:49 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:31:49 localhost podman[85441]: 2025-11-26 08:31:49.843503455 +0000 UTC m=+0.112274583 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible) Nov 26 03:31:49 localhost podman[85441]: 2025-11-26 08:31:49.864140978 +0000 UTC m=+0.132912096 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2025-11-19T00:11:48Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute) Nov 26 03:31:49 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:31:49 localhost podman[85443]: 2025-11-26 08:31:49.985997664 +0000 UTC m=+0.251221524 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi) Nov 26 03:31:50 localhost podman[85443]: 2025-11-26 08:31:50.013176017 +0000 UTC m=+0.278399857 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:31:50 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:31:50 localhost sshd[85511]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:31:53 localhost systemd[1]: tmp-crun.eMrnZZ.mount: Deactivated successfully. Nov 26 03:31:53 localhost podman[85515]: 2025-11-26 08:31:53.817795254 +0000 UTC m=+0.078674164 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:31:53 localhost systemd[1]: tmp-crun.1Z1RmS.mount: Deactivated successfully. Nov 26 03:31:53 localhost podman[85516]: 2025-11-26 08:31:53.856544301 +0000 UTC m=+0.114342476 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:31:53 localhost podman[85515]: 2025-11-26 08:31:53.887253313 +0000 UTC m=+0.148132213 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) 
Nov 26 03:31:53 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:31:53 localhost podman[85516]: 2025-11-26 08:31:53.915457817 +0000 UTC m=+0.173256002 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:31:53 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:31:53 localhost podman[85517]: 2025-11-26 08:31:53.988182057 +0000 UTC m=+0.242321360 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:31:54 localhost podman[85513]: 2025-11-26 08:31:54.026178993 +0000 UTC m=+0.289123766 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc.) 
Nov 26 03:31:54 localhost podman[85514]: 2025-11-26 08:31:54.056521773 +0000 UTC m=+0.318667322 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:31:54 localhost podman[85513]: 2025-11-26 08:31:54.061703942 +0000 UTC m=+0.324648745 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T23:44:13Z, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container) Nov 26 03:31:54 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:31:54 localhost podman[85514]: 2025-11-26 08:31:54.117005897 +0000 UTC m=+0.379151466 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12) Nov 26 03:31:54 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:31:54 localhost podman[85517]: 2025-11-26 08:31:54.321395594 +0000 UTC m=+0.575534827 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 26 03:31:54 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:32:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:32:09 localhost systemd[1]: tmp-crun.c7Wy0z.mount: Deactivated successfully. 
Nov 26 03:32:09 localhost podman[85618]: 2025-11-26 08:32:09.841338047 +0000 UTC m=+0.101508313 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=metrics_qdr, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:32:10 localhost podman[85618]: 2025-11-26 08:32:10.063209061 +0000 UTC m=+0.323379267 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, vcs-type=git, config_id=tripleo_step1) Nov 26 03:32:10 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:32:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:32:14 localhost systemd[1]: tmp-crun.u02aO3.mount: Deactivated successfully. 
Nov 26 03:32:14 localhost podman[85647]: 2025-11-26 08:32:14.829117772 +0000 UTC m=+0.093636393 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, tcib_managed=true) Nov 26 03:32:14 localhost podman[85647]: 2025-11-26 08:32:14.884418128 +0000 UTC m=+0.148936739 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, 
container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044) Nov 26 03:32:14 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:32:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:32:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:32:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:32:20 localhost systemd[1]: tmp-crun.J4ef6J.mount: Deactivated successfully. 
Nov 26 03:32:20 localhost podman[85673]: 2025-11-26 08:32:20.828204071 +0000 UTC m=+0.093370793 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4) Nov 26 03:32:20 localhost podman[85673]: 2025-11-26 08:32:20.852319841 +0000 UTC m=+0.117486623 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, name=rhosp17/openstack-ceilometer-compute) Nov 26 03:32:20 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:32:20 localhost podman[85674]: 2025-11-26 08:32:20.86924562 +0000 UTC m=+0.129426459 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, version=17.1.12, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, name=rhosp17/openstack-cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:32:20 localhost podman[85674]: 2025-11-26 08:32:20.881138945 +0000 UTC m=+0.141319784 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=logrotate_crond, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Nov 26 03:32:20 localhost podman[85675]: 2025-11-26 08:32:20.789409102 +0000 UTC m=+0.053081549 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:32:20 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:32:20 localhost podman[85675]: 2025-11-26 08:32:20.920424809 +0000 UTC m=+0.184097306 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:32:20 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:32:21 localhost systemd[1]: tmp-crun.HDmbyK.mount: Deactivated successfully. Nov 26 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:32:24 localhost systemd[1]: tmp-crun.JwOdlS.mount: Deactivated successfully. 
Nov 26 03:32:24 localhost podman[85817]: 2025-11-26 08:32:24.85792197 +0000 UTC m=+0.105136825 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:32:24 localhost systemd[1]: tmp-crun.wHjRcC.mount: Deactivated successfully. Nov 26 03:32:24 localhost podman[85817]: 2025-11-26 08:32:24.89739589 +0000 UTC m=+0.144610775 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3) Nov 26 03:32:24 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:32:24 localhost podman[85816]: 2025-11-26 08:32:24.951771317 +0000 UTC m=+0.198623361 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-iscsid-container, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid) Nov 26 03:32:24 localhost podman[85818]: 2025-11-26 08:32:24.904198848 +0000 UTC m=+0.151406983 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-type=git) Nov 26 03:32:24 localhost podman[85816]: 2025-11-26 08:32:24.958727871 +0000 UTC m=+0.205579885 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid) Nov 26 03:32:24 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:32:25 localhost podman[85819]: 2025-11-26 08:32:25.00305462 +0000 UTC m=+0.243232019 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1) Nov 26 03:32:25 localhost podman[85820]: 2025-11-26 08:32:25.05391963 +0000 UTC m=+0.294349386 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:32:25 localhost podman[85819]: 2025-11-26 08:32:25.060904733 +0000 UTC m=+0.301082102 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, 
batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.expose-services=) Nov 26 03:32:25 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:32:25 localhost podman[85818]: 2025-11-26 08:32:25.083223808 +0000 UTC m=+0.330431993 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-18T23:34:05Z, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 26 03:32:25 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:32:25 localhost podman[85820]: 2025-11-26 08:32:25.491310741 +0000 UTC m=+0.731740507 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Nov 26 03:32:25 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:32:32 localhost sshd[85972]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:32:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:32:40 localhost podman[85974]: 2025-11-26 08:32:40.826242562 +0000 UTC m=+0.087605997 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:32:41 localhost podman[85974]: 2025-11-26 08:32:41.011267335 +0000 UTC m=+0.272630700 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
container_name=metrics_qdr, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.buildah.version=1.41.4, managed_by=tripleo_ansible) Nov 26 03:32:41 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:32:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:32:45 localhost podman[86003]: 2025-11-26 08:32:45.828247822 +0000 UTC m=+0.090981341 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, tcib_managed=true, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, architecture=x86_64) Nov 26 03:32:45 localhost podman[86003]: 2025-11-26 08:32:45.859413438 +0000 UTC m=+0.122146967 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-type=git, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public) Nov 26 03:32:45 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:32:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:32:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:32:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:32:51 localhost podman[86029]: 2025-11-26 08:32:51.817091089 +0000 UTC m=+0.083642095 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:32:51 localhost podman[86029]: 2025-11-26 08:32:51.871985823 +0000 UTC m=+0.138536819 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:32:51 localhost podman[86030]: 2025-11-26 08:32:51.879209954 +0000 UTC m=+0.141926982 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, 
description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}) Nov 26 03:32:51 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:32:51 localhost podman[86030]: 2025-11-26 08:32:51.909184673 +0000 UTC m=+0.171901671 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 26 03:32:51 localhost systemd[1]: tmp-crun.cDIA7p.mount: Deactivated successfully. Nov 26 03:32:51 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:32:51 localhost podman[86031]: 2025-11-26 08:32:51.932006743 +0000 UTC m=+0.190964316 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:32:51 localhost podman[86031]: 2025-11-26 08:32:51.95735773 +0000 UTC m=+0.216315303 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, 
distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:32:51 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:32:55 localhost systemd[1]: tmp-crun.WUoCAW.mount: Deactivated successfully. 
Nov 26 03:32:55 localhost podman[86106]: 2025-11-26 08:32:55.826724131 +0000 UTC m=+0.080411687 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 26 03:32:55 localhost podman[86105]: 2025-11-26 08:32:55.885164722 +0000 UTC m=+0.143305754 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 26 03:32:55 localhost podman[86105]: 2025-11-26 08:32:55.897388458 +0000 UTC m=+0.155529530 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, container_name=collectd, distribution-scope=public, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 26 03:32:55 localhost podman[86106]: 2025-11-26 08:32:55.906156757 +0000 UTC m=+0.159844363 container exec_died 
4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 26 03:32:55 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:32:55 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:32:55 localhost podman[86107]: 2025-11-26 08:32:55.984808628 +0000 UTC m=+0.235990247 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:32:56 localhost podman[86104]: 2025-11-26 08:32:56.042164056 +0000 UTC m=+0.301470064 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3) Nov 26 03:32:56 localhost 
podman[86107]: 2025-11-26 08:32:56.060427617 +0000 UTC m=+0.311609206 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:32:56 localhost podman[86104]: 2025-11-26 08:32:56.075834838 +0000 UTC m=+0.335140796 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Nov 26 03:32:56 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:32:56 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:32:56 localhost podman[86118]: 2025-11-26 08:32:55.860660402 +0000 UTC m=+0.107373204 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target) Nov 26 03:32:56 localhost podman[86118]: 2025-11-26 08:32:56.245768099 +0000 UTC m=+0.492480911 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true) Nov 26 03:32:56 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:33:07 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:33:07 localhost recover_tripleo_nova_virtqemud[86213]: 61604 Nov 26 03:33:07 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:33:07 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:33:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:33:11 localhost systemd[1]: tmp-crun.y7Uixr.mount: Deactivated successfully. 
Nov 26 03:33:11 localhost podman[86214]: 2025-11-26 08:33:11.827872663 +0000 UTC m=+0.086892335 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Nov 26 03:33:11 localhost podman[86214]: 2025-11-26 08:33:11.998517216 +0000 UTC m=+0.257536928 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12) Nov 26 03:33:12 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:33:13 localhost sshd[86243]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:33:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:33:16 localhost podman[86245]: 2025-11-26 08:33:16.823059532 +0000 UTC m=+0.084207633 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step5, release=1761123044, build-date=2025-11-19T00:36:58Z, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Nov 26 03:33:16 localhost podman[86245]: 2025-11-26 08:33:16.87744491 +0000 UTC m=+0.138593011 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:33:16 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:33:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:33:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:33:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:33:22 localhost podman[86272]: 2025-11-26 08:33:22.832200704 +0000 UTC m=+0.092190018 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044) Nov 26 03:33:22 localhost podman[86272]: 2025-11-26 08:33:22.838696083 +0000 UTC m=+0.098685407 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, architecture=x86_64) Nov 26 03:33:22 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:33:22 localhost systemd[1]: tmp-crun.aMbNBl.mount: Deactivated successfully. 
Nov 26 03:33:22 localhost podman[86273]: 2025-11-26 08:33:22.948692565 +0000 UTC m=+0.202165879 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git) Nov 26 03:33:22 localhost podman[86271]: 2025-11-26 08:33:22.911548727 +0000 UTC m=+0.173554383 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:33:22 localhost podman[86271]: 2025-11-26 08:33:22.993092417 +0000 UTC m=+0.255097993 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:33:23 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:33:23 localhost podman[86273]: 2025-11-26 08:33:23.043466352 +0000 UTC m=+0.296939676 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ceilometer_agent_ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:33:23 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:33:23 localhost systemd[1]: tmp-crun.qzKecK.mount: Deactivated successfully. Nov 26 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:33:26 localhost systemd[1]: tmp-crun.wf9EPh.mount: Deactivated successfully. 
Nov 26 03:33:26 localhost podman[86422]: 2025-11-26 08:33:26.521118581 +0000 UTC m=+0.106823976 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container) Nov 26 03:33:26 localhost podman[86418]: 2025-11-26 08:33:26.536211005 +0000 UTC m=+0.131697639 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:33:26 localhost podman[86419]: 2025-11-26 08:33:26.490963958 +0000 UTC m=+0.086822353 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, config_id=tripleo_step3, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 26 03:33:26 localhost podman[86421]: 2025-11-26 08:33:26.597463143 +0000 UTC m=+0.187753458 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, tcib_managed=true) Nov 26 03:33:26 localhost podman[86420]: 2025-11-26 08:33:26.643571296 +0000 UTC m=+0.235323226 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
release=1761123044, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=) Nov 26 03:33:26 localhost podman[86421]: 2025-11-26 08:33:26.655364628 +0000 UTC m=+0.245654893 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Nov 26 03:33:26 localhost podman[86420]: 2025-11-26 08:33:26.662225169 +0000 UTC m=+0.253977069 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z) Nov 26 03:33:26 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:33:26 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:33:26 localhost podman[86419]: 2025-11-26 08:33:26.674781153 +0000 UTC m=+0.270639538 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 
17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:33:26 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:33:26 localhost podman[86418]: 2025-11-26 08:33:26.726356665 +0000 UTC m=+0.321843279 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, container_name=iscsid) Nov 26 03:33:26 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:33:26 localhost podman[86422]: 2025-11-26 08:33:26.921321512 +0000 UTC m=+0.507026937 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:33:26 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:33:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:33:42 localhost podman[86569]: 2025-11-26 08:33:42.82706024 +0000 UTC m=+0.089885137 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Nov 26 03:33:43 localhost podman[86569]: 2025-11-26 08:33:43.0442567 +0000 UTC m=+0.307081527 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:33:43 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:33:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:33:47 localhost podman[86598]: 2025-11-26 08:33:47.821289781 +0000 UTC m=+0.087954528 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=nova_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Nov 26 03:33:47 localhost podman[86598]: 2025-11-26 08:33:47.87313432 +0000 UTC m=+0.139799087 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, container_name=nova_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git) Nov 26 03:33:47 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:33:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:33:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:33:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:33:53 localhost podman[86625]: 2025-11-26 08:33:53.882400534 +0000 UTC m=+0.138374874 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, container_name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 26 03:33:53 localhost podman[86625]: 2025-11-26 08:33:53.889394459 +0000 UTC m=+0.145368769 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team) Nov 26 03:33:53 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:33:53 localhost podman[86624]: 2025-11-26 08:33:53.856672825 +0000 UTC m=+0.116005137 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044) Nov 26 03:33:53 localhost podman[86624]: 2025-11-26 08:33:53.962363516 +0000 UTC m=+0.221695868 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, version=17.1.12, container_name=ceilometer_agent_compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:33:53 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:33:53 localhost podman[86626]: 2025-11-26 08:33:53.98826763 +0000 UTC m=+0.241896107 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:33:54 localhost podman[86626]: 2025-11-26 08:33:54.018443036 +0000 UTC m=+0.272071553 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, 
managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:33:54 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:33:54 localhost systemd[1]: tmp-crun.Zft2nJ.mount: Deactivated successfully. Nov 26 03:33:55 localhost sshd[86697]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:33:56 localhost podman[86700]: 2025-11-26 08:33:56.83900976 +0000 UTC m=+0.089406963 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 
ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, vcs-type=git) Nov 26 03:33:56 localhost systemd[1]: tmp-crun.N8cXd8.mount: Deactivated successfully. Nov 26 03:33:56 localhost podman[86700]: 2025-11-26 08:33:56.897493223 +0000 UTC m=+0.147890426 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, release=1761123044, container_name=ovn_controller, config_id=tripleo_step4, vendor=Red Hat, Inc.) Nov 26 03:33:56 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:33:56 localhost podman[86701]: 2025-11-26 08:33:56.946611598 +0000 UTC m=+0.192876865 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent) Nov 26 03:33:56 localhost podman[86699]: 2025-11-26 
08:33:56.904014423 +0000 UTC m=+0.154382045 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 26 03:33:56 localhost podman[86701]: 2025-11-26 08:33:56.999401967 +0000 UTC m=+0.245667224 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:33:57 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:33:57 localhost podman[86702]: 2025-11-26 08:33:57.011071785 +0000 UTC m=+0.256388042 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Nov 26 03:33:57 localhost podman[86702]: 2025-11-26 08:33:57.020910167 +0000 UTC m=+0.266226444 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, container_name=iscsid, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:33:57 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:33:57 localhost podman[86699]: 2025-11-26 08:33:57.0396311 +0000 UTC m=+0.289998762 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, config_id=tripleo_step3, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z) Nov 26 03:33:57 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:33:57 localhost podman[86767]: 2025-11-26 08:33:57.094750881 +0000 UTC m=+0.140198890 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Nov 26 03:33:57 localhost podman[86767]: 2025-11-26 08:33:57.489335759 +0000 UTC m=+0.534783738 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:33:57 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:34:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:34:13 localhost podman[86806]: 2025-11-26 08:34:13.816881599 +0000 UTC m=+0.080576781 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com) Nov 26 03:34:13 localhost podman[86806]: 2025-11-26 08:34:13.972315406 +0000 UTC m=+0.236010518 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:34:13 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:34:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:34:18 localhost systemd[83704]: Created slice User Background Tasks Slice. Nov 26 03:34:18 localhost podman[86833]: 2025-11-26 08:34:18.82117494 +0000 UTC m=+0.086119822 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:34:18 localhost systemd[83704]: Starting Cleanup of User's Temporary Files and Directories... Nov 26 03:34:18 localhost systemd[83704]: Finished Cleanup of User's Temporary Files and Directories. 
Nov 26 03:34:18 localhost podman[86833]: 2025-11-26 08:34:18.881648043 +0000 UTC m=+0.146592875 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:34:18 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:34:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:34:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:34:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:34:24 localhost podman[86860]: 2025-11-26 08:34:24.823408538 +0000 UTC m=+0.080299543 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:34:24 localhost podman[86861]: 2025-11-26 08:34:24.871067659 +0000 UTC m=+0.123537629 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:49:32Z, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 26 03:34:24 localhost systemd[1]: tmp-crun.2mmTkR.mount: Deactivated successfully. 
Nov 26 03:34:24 localhost podman[86860]: 2025-11-26 08:34:24.874757733 +0000 UTC m=+0.131648718 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 
17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute) Nov 26 03:34:24 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:34:24 localhost podman[86862]: 2025-11-26 08:34:24.942608313 +0000 UTC m=+0.190460431 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12) Nov 26 03:34:24 localhost podman[86861]: 2025-11-26 08:34:24.955610081 +0000 UTC m=+0.208080121 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:34:24 localhost 
podman[86862]: 2025-11-26 08:34:24.964217466 +0000 UTC m=+0.212069604 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:34:24 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:34:24 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:34:25 localhost systemd[1]: tmp-crun.aQpU4z.mount: Deactivated successfully. Nov 26 03:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:34:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:34:27 localhost systemd[1]: tmp-crun.wEbZDv.mount: Deactivated successfully. 
Nov 26 03:34:27 localhost podman[87008]: 2025-11-26 08:34:27.841049935 +0000 UTC m=+0.075342331 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute) Nov 26 03:34:27 localhost systemd[1]: tmp-crun.z8mBk7.mount: Deactivated successfully. Nov 26 03:34:27 localhost podman[86996]: 2025-11-26 08:34:27.883134045 +0000 UTC m=+0.132080291 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd) Nov 26 03:34:27 localhost podman[86997]: 2025-11-26 08:34:27.900810348 +0000 UTC m=+0.145661188 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 26 03:34:27 localhost podman[86996]: 2025-11-26 08:34:27.913816806 +0000 UTC m=+0.162763022 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:34:27 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:34:27 localhost podman[86997]: 2025-11-26 08:34:27.945334003 +0000 UTC m=+0.190184803 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:34:27 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:34:28 localhost podman[86995]: 2025-11-26 08:34:27.99904571 +0000 UTC m=+0.247125899 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:34:28 localhost podman[86998]: 2025-11-26 08:34:27.918275823 +0000 UTC m=+0.156513350 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 26 03:34:28 localhost podman[86995]: 2025-11-26 08:34:28.037319392 +0000 UTC m=+0.285399571 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64) Nov 26 03:34:28 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:34:28 localhost podman[86998]: 2025-11-26 08:34:28.055341466 +0000 UTC m=+0.293578973 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com) Nov 26 03:34:28 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:34:28 localhost podman[87008]: 2025-11-26 08:34:28.179247674 +0000 UTC m=+0.413540070 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12) Nov 26 03:34:28 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:34:37 localhost sshd[87161]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:34:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:34:44 localhost podman[87163]: 2025-11-26 08:34:44.81227904 +0000 UTC m=+0.073128504 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible) Nov 26 03:34:45 localhost podman[87163]: 2025-11-26 08:34:45.004459152 +0000 UTC m=+0.265308656 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044) Nov 26 03:34:45 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:34:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:34:49 localhost podman[87193]: 2025-11-26 08:34:49.825126292 +0000 UTC m=+0.082844082 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, release=1761123044, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Nov 26 03:34:49 localhost podman[87193]: 2025-11-26 08:34:49.862323732 +0000 UTC m=+0.120041462 container exec_died 
f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64) Nov 26 03:34:49 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:34:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:34:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:34:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:34:55 localhost systemd[1]: tmp-crun.gYmnMj.mount: Deactivated successfully. 
Nov 26 03:34:55 localhost podman[87221]: 2025-11-26 08:34:55.848181199 +0000 UTC m=+0.104128234 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64) Nov 26 03:34:55 localhost podman[87220]: 2025-11-26 08:34:55.820343385 +0000 UTC m=+0.086012368 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=logrotate_crond, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:32Z, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:34:55 localhost podman[87219]: 2025-11-26 08:34:55.875729974 +0000 UTC m=+0.140451897 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:34:55 localhost podman[87220]: 2025-11-26 08:34:55.905391703 
+0000 UTC m=+0.171060726 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, 
release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4) Nov 26 03:34:55 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:34:55 localhost podman[87219]: 2025-11-26 08:34:55.927262373 +0000 UTC m=+0.191984306 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64) Nov 26 03:34:55 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:34:55 localhost podman[87221]: 2025-11-26 08:34:55.980746013 +0000 UTC m=+0.236693078 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 26 03:34:55 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:34:58 localhost systemd[1]: tmp-crun.SN3QpD.mount: Deactivated successfully. 
Nov 26 03:34:58 localhost podman[87291]: 2025-11-26 08:34:58.828294215 +0000 UTC m=+0.083134620 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:34:58 localhost podman[87291]: 2025-11-26 08:34:58.874282565 +0000 UTC m=+0.129122950 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:34:58 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:34:58 localhost podman[87289]: 2025-11-26 08:34:58.858982776 +0000 UTC m=+0.117858625 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Nov 26 03:34:58 localhost podman[87302]: 2025-11-26 08:34:58.942685043 +0000 UTC m=+0.186374506 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4) Nov 26 03:34:58 localhost podman[87289]: 2025-11-26 08:34:58.943332092 +0000 UTC m=+0.202207971 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, release=1761123044, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=iscsid, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Nov 26 03:34:58 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:34:59 localhost podman[87292]: 2025-11-26 08:34:59.004222259 +0000 UTC m=+0.252226774 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:34:59 localhost podman[87290]: 2025-11-26 08:34:58.875022637 +0000 UTC m=+0.132234305 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) 
Nov 26 03:34:59 localhost podman[87292]: 2025-11-26 08:34:59.04632807 +0000 UTC m=+0.294332615 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:34:59 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:34:59 localhost podman[87290]: 2025-11-26 08:34:59.05840556 +0000 UTC m=+0.315617278 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat 
OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 03:34:59 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:34:59 localhost podman[87302]: 2025-11-26 08:34:59.301315989 +0000 UTC m=+0.545005472 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z) Nov 26 03:34:59 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:34:59 localhost systemd[1]: tmp-crun.r5MySg.mount: Deactivated successfully. Nov 26 03:35:07 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 26 03:35:07 localhost recover_tripleo_nova_virtqemud[87398]: 61604 Nov 26 03:35:07 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:35:07 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:35:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:35:15 localhost podman[87399]: 2025-11-26 08:35:15.860293394 +0000 UTC m=+0.088767153 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:35:16 localhost podman[87399]: 2025-11-26 08:35:16.076332938 +0000 UTC m=+0.304806697 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=metrics_qdr, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true) Nov 26 03:35:16 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:35:19 localhost sshd[87428]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:35:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:35:20 localhost systemd[1]: tmp-crun.YWj3Oc.mount: Deactivated successfully. Nov 26 03:35:20 localhost podman[87430]: 2025-11-26 08:35:20.879712236 +0000 UTC m=+0.137385663 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Nov 26 03:35:20 localhost podman[87430]: 2025-11-26 08:35:20.934658461 +0000 UTC m=+0.192331858 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, config_id=tripleo_step5, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com) Nov 26 03:35:20 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:35:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:35:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:35:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:35:26 localhost systemd[1]: tmp-crun.BbEagm.mount: Deactivated successfully. 
Nov 26 03:35:26 localhost podman[87457]: 2025-11-26 08:35:26.882872356 +0000 UTC m=+0.141663922 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:49:32Z) Nov 26 03:35:26 localhost podman[87456]: 2025-11-26 08:35:26.849890049 +0000 UTC m=+0.112287826 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible) Nov 26 03:35:26 localhost podman[87456]: 2025-11-26 08:35:26.937641821 +0000 UTC m=+0.200039658 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:35:26 localhost podman[87458]: 2025-11-26 08:35:26.952477585 +0000 UTC m=+0.204982469 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-type=git) Nov 26 03:35:26 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:35:26 localhost podman[87457]: 2025-11-26 08:35:26.972535418 +0000 UTC m=+0.231327014 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:35:26 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:35:26 localhost podman[87458]: 2025-11-26 08:35:26.986749592 +0000 UTC m=+0.239254446 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:35:26 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:35:29 localhost podman[87592]: 2025-11-26 08:35:29.843675409 +0000 UTC m=+0.100956308 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:35:29 localhost podman[87592]: 2025-11-26 08:35:29.852576171 +0000 UTC m=+0.109857040 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z) Nov 26 03:35:29 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:35:29 localhost systemd[1]: tmp-crun.f6n6c3.mount: Deactivated successfully. 
Nov 26 03:35:29 localhost podman[87606]: 2025-11-26 08:35:29.905788648 +0000 UTC m=+0.147058377 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4) Nov 26 03:35:29 localhost podman[87593]: 2025-11-26 08:35:29.946600707 +0000 UTC m=+0.204233086 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, com.redhat.component=openstack-collectd-container, container_name=collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 26 03:35:30 localhost podman[87594]: 2025-11-26 08:35:30.061909132 +0000 UTC m=+0.314325422 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1) Nov 26 03:35:30 localhost podman[87595]: 2025-11-26 08:35:30.149685686 +0000 UTC m=+0.399910388 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Nov 26 03:35:30 localhost podman[87594]: 2025-11-26 08:35:30.175499336 +0000 UTC m=+0.427915616 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:35:30 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:35:30 localhost podman[87595]: 2025-11-26 08:35:30.190551026 +0000 UTC m=+0.440775718 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, container_name=ovn_metadata_agent, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:35:30 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:35:30 localhost podman[87593]: 2025-11-26 08:35:30.221217284 +0000 UTC m=+0.478849613 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true) Nov 26 03:35:30 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:35:30 localhost podman[87606]: 2025-11-26 08:35:30.298361833 +0000 UTC m=+0.539631582 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:35:30 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:35:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:35:46 localhost podman[87757]: 2025-11-26 08:35:46.827833765 +0000 UTC m=+0.089446926 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.expose-services=, vcs-type=git) Nov 26 03:35:47 localhost podman[87757]: 2025-11-26 08:35:47.006330183 +0000 UTC m=+0.267943324 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:35:47 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:35:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:35:51 localhost podman[87786]: 2025-11-26 08:35:51.795123513 +0000 UTC m=+0.061515822 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true) Nov 26 03:35:51 localhost podman[87786]: 2025-11-26 08:35:51.847271618 +0000 UTC m=+0.113663927 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, 
name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, version=17.1.12) Nov 26 03:35:51 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:35:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:35:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:35:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:35:57 localhost systemd[1]: tmp-crun.6dz9b9.mount: Deactivated successfully. 
Nov 26 03:35:57 localhost podman[87813]: 2025-11-26 08:35:57.84222632 +0000 UTC m=+0.099730351 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 26 03:35:57 localhost podman[87814]: 2025-11-26 08:35:57.817256566 +0000 UTC m=+0.074287102 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi) Nov 26 03:35:57 localhost podman[87812]: 2025-11-26 08:35:57.876800967 +0000 UTC m=+0.136129724 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute) Nov 26 03:35:57 localhost podman[87813]: 2025-11-26 08:35:57.880261993 +0000 UTC m=+0.137765994 container 
exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, distribution-scope=public) Nov 26 03:35:57 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:35:57 localhost podman[87814]: 2025-11-26 08:35:57.901619136 +0000 UTC m=+0.158649642 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com) Nov 26 03:35:57 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:35:57 localhost podman[87812]: 2025-11-26 08:35:57.930753726 +0000 UTC m=+0.190082523 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4) Nov 26 03:35:57 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:36:00 localhost systemd[1]: tmp-crun.Uh6lR7.mount: Deactivated successfully. 
Nov 26 03:36:00 localhost podman[87884]: 2025-11-26 08:36:00.825083146 +0000 UTC m=+0.087599869 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, 
vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 26 03:36:00 localhost podman[87884]: 2025-11-26 08:36:00.861252772 +0000 UTC m=+0.123769485 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, container_name=collectd, release=1761123044, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-collectd-container) Nov 26 03:36:00 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:36:00 localhost podman[87883]: 2025-11-26 08:36:00.918994208 +0000 UTC m=+0.183750280 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step3, container_name=iscsid, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64) Nov 26 03:36:00 localhost podman[87883]: 2025-11-26 08:36:00.928247041 +0000 UTC m=+0.193003153 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 26 03:36:00 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:36:00 localhost podman[87885]: 2025-11-26 08:36:00.894055375 +0000 UTC m=+0.155847446 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:36:00 localhost podman[87886]: 2025-11-26 08:36:00.800539806 +0000 UTC m=+0.062507893 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Nov 26 03:36:00 localhost podman[87885]: 2025-11-26 08:36:00.973785074 +0000 UTC m=+0.235577125 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:36:00 localhost podman[87886]: 2025-11-26 08:36:00.983348636 +0000 UTC m=+0.245316753 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:36:00 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:36:00 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:36:01 localhost podman[87891]: 2025-11-26 08:36:00.98872403 +0000 UTC m=+0.240482704 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 26 03:36:01 localhost sshd[87986]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:36:01 localhost podman[87891]: 2025-11-26 08:36:01.321165196 +0000 UTC m=+0.572923830 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:36:01 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:36:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:36:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 5692 writes, 25K keys, 5692 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5692 writes, 763 syncs, 7.46 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 461 writes, 1911 keys, 461 commit groups, 1.0 writes per commit group, ingest: 2.36 MB, 0.00 MB/s#012Interval WAL: 461 writes, 167 syncs, 2.76 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 03:36:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:36:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 4860 writes, 21K keys, 4860 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4860 writes, 621 syncs, 7.83 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 504 writes, 1886 keys, 504 commit groups, 1.0 writes per commit group, ingest: 2.29 MB, 0.00 MB/s#012Interval WAL: 504 writes, 185 syncs, 2.72 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 03:36:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:36:17 localhost podman[87988]: 2025-11-26 08:36:17.823245222 +0000 UTC m=+0.081174053 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:36:18 localhost podman[87988]: 2025-11-26 08:36:18.045288441 +0000 UTC m=+0.303217232 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:36:18 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:36:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:36:22 localhost podman[88016]: 2025-11-26 08:36:22.822387184 +0000 UTC m=+0.082315329 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:36:22 localhost podman[88016]: 2025-11-26 08:36:22.850852234 +0000 UTC m=+0.110780439 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-nova-compute, container_name=nova_compute) Nov 26 03:36:22 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:36:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:36:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:36:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:36:28 localhost systemd[1]: tmp-crun.RF4Ode.mount: Deactivated successfully. Nov 26 03:36:28 localhost systemd[1]: tmp-crun.MyrToF.mount: Deactivated successfully. 
Nov 26 03:36:28 localhost podman[88044]: 2025-11-26 08:36:28.917554199 +0000 UTC m=+0.179783839 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:36:28 localhost podman[88043]: 2025-11-26 08:36:28.885198439 +0000 UTC m=+0.150078750 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:11:48Z, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:36:28 localhost podman[88045]: 2025-11-26 08:36:28.898583869 +0000 UTC m=+0.158869099 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 26 03:36:28 localhost podman[88043]: 2025-11-26 
08:36:28.965474294 +0000 UTC m=+0.230354615 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, 
distribution-scope=public, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container) Nov 26 03:36:28 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:36:28 localhost podman[88045]: 2025-11-26 08:36:28.982285368 +0000 UTC m=+0.242570598 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:36:28 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:36:29 localhost podman[88044]: 2025-11-26 08:36:29.002381883 +0000 UTC m=+0.264611513 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, release=1761123044, com.redhat.component=openstack-cron-container, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible) Nov 26 03:36:29 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:36:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:36:31 localhost systemd[1]: tmp-crun.yqZoZv.mount: Deactivated successfully. Nov 26 03:36:31 localhost systemd[1]: tmp-crun.B6bsT4.mount: Deactivated successfully. 
Nov 26 03:36:31 localhost podman[88127]: 2025-11-26 08:36:31.462800547 +0000 UTC m=+0.168585596 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com) Nov 26 03:36:31 localhost podman[88129]: 2025-11-26 08:36:31.467706577 +0000 UTC m=+0.169605838 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible) Nov 26 03:36:31 localhost podman[88129]: 2025-11-26 08:36:31.523479043 +0000 UTC m=+0.225378294 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-type=git, build-date=2025-11-18T23:34:05Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:36:31 localhost podman[88188]: 2025-11-26 08:36:31.551044616 +0000 UTC m=+0.125342234 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, 
config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:36:31 localhost podman[88127]: 2025-11-26 08:36:31.560958508 +0000 UTC m=+0.266743507 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.component=openstack-iscsid-container, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
config_id=tripleo_step3, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:36:31 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:36:31 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:36:31 localhost podman[88128]: 2025-11-26 08:36:31.51262352 +0000 UTC m=+0.220697219 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:36:31 localhost podman[88130]: 2025-11-26 08:36:31.56886828 +0000 UTC m=+0.263783276 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 26 03:36:31 localhost podman[88128]: 2025-11-26 08:36:31.644286316 +0000 UTC m=+0.352359985 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4) Nov 26 03:36:31 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:36:31 localhost podman[88130]: 2025-11-26 08:36:31.700548297 +0000 UTC m=+0.395463233 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z) Nov 26 03:36:31 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:36:31 localhost podman[88188]: 2025-11-26 08:36:31.92534602 +0000 UTC m=+0.499643638 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, container_name=nova_migration_target) Nov 26 03:36:31 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:36:43 localhost sshd[88341]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:36:44 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:36:44 localhost recover_tripleo_nova_virtqemud[88344]: 61604 Nov 26 03:36:44 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:36:44 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:36:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:36:48 localhost podman[88345]: 2025-11-26 08:36:48.820245175 +0000 UTC m=+0.084217777 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Nov 26 03:36:49 localhost podman[88345]: 2025-11-26 08:36:49.04607528 +0000 UTC m=+0.310047892 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:36:49 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:36:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:36:54 localhost podman[88375]: 2025-11-26 08:36:54.093163266 +0000 UTC m=+0.089756366 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1) Nov 26 03:36:54 localhost podman[88375]: 2025-11-26 08:36:54.128292601 +0000 UTC m=+0.124885761 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vendor=Red Hat, Inc.) Nov 26 03:36:54 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:36:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:36:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:36:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:36:59 localhost podman[88401]: 2025-11-26 08:36:59.822304929 +0000 UTC m=+0.087019172 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, version=17.1.12, release=1761123044) Nov 26 03:36:59 localhost podman[88401]: 2025-11-26 08:36:59.863331283 +0000 UTC m=+0.128045496 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:36:59 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:36:59 localhost podman[88402]: 2025-11-26 08:36:59.8782825 +0000 UTC m=+0.140216908 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, tcib_managed=true, name=rhosp17/openstack-cron) Nov 26 03:36:59 localhost podman[88402]: 2025-11-26 08:36:59.886236384 +0000 UTC m=+0.148170792 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:36:59 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:36:59 localhost podman[88403]: 2025-11-26 08:36:59.928717272 +0000 UTC m=+0.188246407 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 26 03:36:59 localhost podman[88403]: 2025-11-26 08:36:59.985326554 +0000 UTC m=+0.244855699 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public) Nov 26 03:37:00 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:37:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:37:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:37:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:37:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:37:01 localhost podman[88475]: 2025-11-26 08:37:01.805357895 +0000 UTC m=+0.069289689 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Nov 26 03:37:01 localhost podman[88474]: 2025-11-26 08:37:01.816082473 +0000 UTC m=+0.077328645 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=collectd, version=17.1.12) Nov 26 03:37:01 localhost podman[88475]: 2025-11-26 08:37:01.855359614 +0000 UTC m=+0.119291408 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:37:01 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:37:01 localhost podman[88476]: 2025-11-26 08:37:01.858924703 +0000 UTC m=+0.118144364 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4) Nov 26 03:37:01 localhost podman[88473]: 2025-11-26 08:37:01.914942096 +0000 UTC m=+0.178554480 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:37:01 localhost podman[88474]: 2025-11-26 08:37:01.936526006 +0000 UTC m=+0.197772188 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, 
container_name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, 
io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:37:01 localhost podman[88476]: 2025-11-26 08:37:01.943242142 +0000 UTC m=+0.202461813 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, version=17.1.12) Nov 26 03:37:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:37:01 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:37:01 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:37:02 localhost podman[88473]: 2025-11-26 08:37:02.000166532 +0000 UTC m=+0.263778906 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, version=17.1.12, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack 
Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 26 03:37:02 localhost podman[88559]: 2025-11-26 08:37:02.040921648 +0000 UTC m=+0.064896955 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target) Nov 26 03:37:02 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:37:02 localhost podman[88559]: 2025-11-26 08:37:02.43961335 +0000 UTC m=+0.463588707 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:37:02 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:37:02 localhost systemd[1]: tmp-crun.kWS8Aj.mount: Deactivated successfully. Nov 26 03:37:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:37:19 localhost podman[88585]: 2025-11-26 08:37:19.828356462 +0000 UTC m=+0.085618350 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:37:20 localhost podman[88585]: 2025-11-26 08:37:20.025459788 +0000 UTC m=+0.282721676 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 
17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:37:20 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:37:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:37:24 localhost podman[88613]: 2025-11-26 08:37:24.824974677 +0000 UTC m=+0.087398474 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git) Nov 26 03:37:24 localhost podman[88613]: 2025-11-26 08:37:24.848576798 +0000 UTC m=+0.111000645 container exec_died 
f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Nov 26 03:37:24 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:37:24 localhost sshd[88640]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:37:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:37:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:37:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:37:30 localhost podman[88643]: 2025-11-26 08:37:30.812326515 +0000 UTC m=+0.073489138 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, version=17.1.12, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=logrotate_crond) Nov 26 03:37:30 localhost podman[88643]: 2025-11-26 08:37:30.825354773 +0000 UTC m=+0.086517466 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:37:30 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:37:30 localhost podman[88644]: 2025-11-26 08:37:30.793107557 +0000 UTC m=+0.056872039 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Nov 26 03:37:30 localhost podman[88644]: 2025-11-26 08:37:30.875821996 +0000 UTC m=+0.139586478 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:37:30 localhost podman[88642]: 2025-11-26 08:37:30.884997517 +0000 UTC m=+0.147500621 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:37:30 localhost systemd[1]: 
90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:37:30 localhost podman[88642]: 2025-11-26 08:37:30.913090076 +0000 UTC m=+0.175593240 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible) Nov 26 03:37:30 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:37:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:37:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:37:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:37:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:37:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:37:32 localhost podman[88716]: 2025-11-26 08:37:32.829707402 +0000 UTC m=+0.079317516 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:37:32 localhost systemd[1]: tmp-crun.eiBno7.mount: Deactivated successfully. Nov 26 03:37:32 localhost podman[88714]: 2025-11-26 08:37:32.870997205 +0000 UTC m=+0.130697778 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:37:32 localhost podman[88716]: 2025-11-26 08:37:32.877295597 +0000 UTC m=+0.126905671 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, version=17.1.12) Nov 26 03:37:32 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:37:32 localhost podman[88714]: 2025-11-26 08:37:32.906476529 +0000 UTC m=+0.166177142 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:37:32 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:37:32 localhost podman[88715]: 2025-11-26 08:37:32.918479577 +0000 UTC m=+0.175224829 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, name=rhosp17/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:37:32 localhost podman[88715]: 2025-11-26 08:37:32.930176774 +0000 UTC m=+0.186922006 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git) Nov 26 03:37:32 localhost podman[88723]: 2025-11-26 08:37:32.943856353 +0000 UTC m=+0.191219229 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Nov 26 03:37:32 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:37:32 localhost podman[88728]: 2025-11-26 08:37:32.98039169 +0000 UTC m=+0.222722492 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:37:32 localhost podman[88723]: 2025-11-26 08:37:32.986422134 +0000 UTC m=+0.233785010 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:37:32 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:37:33 localhost podman[88728]: 2025-11-26 08:37:33.346513824 +0000 UTC m=+0.588844716 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, vcs-type=git, distribution-scope=public, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, managed_by=tripleo_ansible) Nov 26 03:37:33 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:37:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:37:50 localhost systemd[1]: tmp-crun.5TGhnZ.mount: Deactivated successfully. 
Nov 26 03:37:50 localhost podman[88974]: 2025-11-26 08:37:50.847897449 +0000 UTC m=+0.104367462 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1) Nov 26 03:37:51 localhost podman[88974]: 2025-11-26 08:37:51.040678364 +0000 UTC m=+0.297148307 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:37:51 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:37:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:37:55 localhost podman[89004]: 2025-11-26 08:37:55.823651374 +0000 UTC m=+0.083232726 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, config_id=tripleo_step5, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Nov 26 03:37:55 localhost podman[89004]: 2025-11-26 08:37:55.859375417 +0000 UTC m=+0.118956829 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:37:55 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:38:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:38:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:38:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:38:01 localhost systemd[1]: tmp-crun.VhJdeU.mount: Deactivated successfully. 
Nov 26 03:38:01 localhost podman[89033]: 2025-11-26 08:38:01.839537095 +0000 UTC m=+0.096351716 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4) Nov 26 03:38:01 localhost podman[89033]: 2025-11-26 08:38:01.887269015 +0000 UTC m=+0.144083636 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, release=1761123044) Nov 26 03:38:01 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:38:01 localhost podman[89032]: 2025-11-26 08:38:01.89100736 +0000 UTC m=+0.147957785 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4) Nov 26 03:38:01 localhost podman[89031]: 2025-11-26 08:38:01.946012841 +0000 UTC m=+0.206260517 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container) Nov 26 03:38:01 localhost podman[89031]: 2025-11-26 08:38:01.974366899 +0000 UTC m=+0.234614735 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
container_name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 26 03:38:01 
localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:38:02 localhost podman[89032]: 2025-11-26 08:38:02.024572313 +0000 UTC m=+0.281522738 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vendor=Red Hat, Inc.) Nov 26 03:38:02 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:38:02 localhost systemd[1]: tmp-crun.vL4K7d.mount: Deactivated successfully. Nov 26 03:38:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:38:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:38:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:38:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:38:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:38:03 localhost podman[89104]: 2025-11-26 08:38:03.821134069 +0000 UTC m=+0.071545149 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, release=1761123044) Nov 26 03:38:03 localhost systemd[1]: tmp-crun.hGzBLy.mount: Deactivated successfully. 
Nov 26 03:38:03 localhost podman[89101]: 2025-11-26 08:38:03.884017351 +0000 UTC m=+0.140262870 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
url=https://www.redhat.com, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Nov 26 03:38:03 localhost podman[89104]: 2025-11-26 08:38:03.894347947 +0000 UTC m=+0.144759037 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.buildah.version=1.41.4) Nov 26 03:38:03 localhost podman[89101]: 2025-11-26 08:38:03.9013136 +0000 UTC m=+0.157559109 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:38:03 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:38:03 localhost podman[89111]: 2025-11-26 08:38:03.935377612 +0000 UTC m=+0.180292264 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:38:03 localhost podman[89103]: 2025-11-26 08:38:03.897869005 +0000 UTC m=+0.151983318 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 26 03:38:03 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:38:03 localhost podman[89103]: 2025-11-26 08:38:03.979321455 +0000 UTC m=+0.233435738 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_controller) Nov 26 03:38:03 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:38:04 localhost podman[89102]: 2025-11-26 08:38:04.049011536 +0000 UTC m=+0.305726399 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:38:04 localhost podman[89102]: 2025-11-26 08:38:04.064305154 +0000 UTC m=+0.321020017 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.expose-services=) Nov 26 03:38:04 localhost systemd[1]: 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:38:04 localhost podman[89111]: 2025-11-26 08:38:04.308092449 +0000 UTC m=+0.553007151 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64) Nov 26 03:38:04 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:38:04 localhost sshd[89209]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:38:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:38:21 localhost podman[89211]: 2025-11-26 08:38:21.82655957 +0000 UTC m=+0.083359610 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:38:22 localhost podman[89211]: 2025-11-26 08:38:22.021388358 +0000 UTC m=+0.278188358 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible) Nov 26 03:38:22 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:38:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:38:26 localhost podman[89241]: 2025-11-26 08:38:26.824111085 +0000 UTC m=+0.087325151 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Nov 26 03:38:26 localhost podman[89241]: 2025-11-26 08:38:26.873390662 +0000 UTC m=+0.136604698 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, tcib_managed=true, batch=17.1_20251118.1, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:38:26 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:38:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:38:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:38:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:38:32 localhost podman[89267]: 2025-11-26 08:38:32.831144135 +0000 UTC m=+0.095572223 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, 
architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:38:32 localhost podman[89267]: 2025-11-26 08:38:32.856065387 +0000 UTC m=+0.120493465 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4) Nov 26 03:38:32 localhost systemd[1]: tmp-crun.A5qbyF.mount: Deactivated successfully. 
Nov 26 03:38:32 localhost podman[89269]: 2025-11-26 08:38:32.869287381 +0000 UTC m=+0.128095799 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:38:32 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:38:32 localhost podman[89269]: 2025-11-26 08:38:32.897263607 +0000 UTC m=+0.156071974 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64) Nov 26 03:38:32 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:38:32 localhost podman[89268]: 2025-11-26 08:38:32.983306238 +0000 UTC m=+0.244865479 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044) Nov 26 03:38:32 localhost podman[89268]: 2025-11-26 08:38:32.994257973 +0000 UTC m=+0.255817254 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:38:33 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:38:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:38:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:38:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:38:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:38:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:38:34 localhost systemd[1]: tmp-crun.UzUdb4.mount: Deactivated successfully. Nov 26 03:38:34 localhost podman[89362]: 2025-11-26 08:38:34.903347558 +0000 UTC m=+0.158991943 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc.) Nov 26 03:38:34 localhost podman[89363]: 2025-11-26 08:38:34.86155989 +0000 UTC m=+0.116641878 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., version=17.1.12, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com) Nov 26 03:38:34 localhost podman[89362]: 2025-11-26 08:38:34.937327097 +0000 UTC m=+0.192971502 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, container_name=collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 
17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:38:34 localhost systemd[1]: 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:38:34 localhost podman[89374]: 2025-11-26 08:38:34.872196336 +0000 UTC m=+0.116342719 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:38:34 localhost podman[89361]: 2025-11-26 08:38:34.985796419 +0000 UTC m=+0.244127476 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:38:34 localhost podman[89363]: 2025-11-26 08:38:34.993806234 +0000 UTC m=+0.248888192 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Nov 26 03:38:35 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:38:35 localhost podman[89365]: 2025-11-26 08:38:34.943101734 +0000 UTC m=+0.188733952 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:38:35 localhost podman[89365]: 2025-11-26 08:38:35.077235265 +0000 UTC m=+0.322867483 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044) Nov 26 03:38:35 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:38:35 localhost podman[89361]: 2025-11-26 08:38:35.096080642 +0000 UTC m=+0.354411679 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, tcib_managed=true) Nov 26 03:38:35 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:38:35 localhost podman[89374]: 2025-11-26 08:38:35.192301914 +0000 UTC m=+0.436448317 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container) Nov 26 03:38:35 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:38:43 localhost sshd[89547]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:38:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 26 03:38:47 localhost recover_tripleo_nova_virtqemud[89551]: 61604 Nov 26 03:38:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:38:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:38:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:38:52 localhost podman[89552]: 2025-11-26 08:38:52.797086415 +0000 UTC m=+0.064818533 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, version=17.1.12, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc.) 
Nov 26 03:38:52 localhost podman[89552]: 2025-11-26 08:38:52.964227475 +0000 UTC m=+0.231959593 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:38:52 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:38:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:38:57 localhost podman[89581]: 2025-11-26 08:38:57.817390056 +0000 UTC m=+0.078862393 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible) Nov 26 03:38:57 localhost podman[89581]: 2025-11-26 08:38:57.857559484 +0000 UTC m=+0.119031821 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z) Nov 26 03:38:57 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:39:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:39:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. 
Nov 26 03:39:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:39:03 localhost systemd[1]: tmp-crun.gFcDCs.mount: Deactivated successfully. Nov 26 03:39:03 localhost podman[89609]: 2025-11-26 08:39:03.821152627 +0000 UTC m=+0.079803802 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, release=1761123044, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:39:03 localhost podman[89609]: 2025-11-26 08:39:03.881294635 +0000 UTC m=+0.139945880 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, release=1761123044, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true) Nov 26 03:39:03 localhost podman[89607]: 2025-11-26 08:39:03.8840711 +0000 UTC m=+0.146529371 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute) Nov 26 03:39:03 localhost podman[89608]: 2025-11-26 08:39:03.841150538 +0000 UTC m=+0.099721411 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 26 03:39:03 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:39:03 localhost podman[89608]: 2025-11-26 08:39:03.919666209 +0000 UTC m=+0.178237072 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=) Nov 26 03:39:03 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:39:03 localhost podman[89607]: 2025-11-26 08:39:03.968243224 +0000 UTC m=+0.230701515 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 03:39:03 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:39:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:39:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:39:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:39:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:39:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:39:05 localhost podman[89681]: 2025-11-26 08:39:05.837139869 +0000 UTC m=+0.093644524 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, vcs-type=git) Nov 26 03:39:05 localhost systemd[1]: tmp-crun.MUGm96.mount: Deactivated successfully. Nov 26 03:39:05 localhost podman[89682]: 2025-11-26 08:39:05.90026355 +0000 UTC m=+0.150981248 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, release=1761123044, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ovn-controller) Nov 26 03:39:05 localhost podman[89680]: 2025-11-26 08:39:05.939092787 +0000 UTC m=+0.195369125 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, 
build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, container_name=iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true) Nov 26 03:39:05 localhost podman[89680]: 2025-11-26 08:39:05.950392323 +0000 UTC m=+0.206668641 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 26 03:39:05 localhost podman[89681]: 2025-11-26 08:39:05.951104584 +0000 UTC m=+0.207609269 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team) Nov 26 03:39:05 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:39:06 localhost podman[89683]: 2025-11-26 08:39:06.042016064 +0000 UTC m=+0.291530765 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:39:06 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:39:06 localhost podman[89693]: 2025-11-26 08:39:06.098834242 +0000 UTC m=+0.344170845 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, container_name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, version=17.1.12, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Nov 26 03:39:06 localhost podman[89683]: 2025-11-26 08:39:06.115380008 +0000 UTC m=+0.364894729 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:39:06 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:39:06 localhost podman[89682]: 2025-11-26 08:39:06.138472724 +0000 UTC m=+0.389190452 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 26 03:39:06 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:39:06 localhost podman[89693]: 2025-11-26 08:39:06.437932291 +0000 UTC m=+0.683268904 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true) Nov 26 03:39:06 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:39:23 localhost sshd[89789]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:39:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:39:23 localhost podman[89790]: 2025-11-26 08:39:23.280029582 +0000 UTC m=+0.086467485 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 26 03:39:23 localhost podman[89790]: 2025-11-26 08:39:23.469273638 +0000 UTC m=+0.275711481 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12) Nov 26 03:39:23 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:39:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:39:28 localhost podman[89820]: 2025-11-26 08:39:28.814866072 +0000 UTC m=+0.074210400 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:39:28 localhost podman[89820]: 2025-11-26 08:39:28.872409751 +0000 UTC m=+0.131754109 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_id=tripleo_step5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:39:28 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:39:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:39:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:39:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:39:34 localhost systemd[1]: tmp-crun.Ram4zh.mount: Deactivated successfully. 
Nov 26 03:39:34 localhost podman[89847]: 2025-11-26 08:39:34.814262999 +0000 UTC m=+0.081055399 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z, release=1761123044) Nov 26 03:39:34 localhost podman[89848]: 2025-11-26 08:39:34.866012592 +0000 UTC m=+0.128572032 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, release=1761123044, architecture=x86_64, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, batch=17.1_20251118.1) Nov 26 03:39:34 localhost podman[89848]: 2025-11-26 08:39:34.876230844 +0000 UTC m=+0.138790284 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=) Nov 26 03:39:34 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:39:34 localhost podman[89847]: 2025-11-26 08:39:34.889690066 +0000 UTC m=+0.156482456 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:39:34 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:39:34 localhost podman[89849]: 2025-11-26 08:39:34.974507759 +0000 UTC m=+0.236355888 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:39:35 localhost podman[89849]: 2025-11-26 08:39:35.002300219 +0000 UTC m=+0.264148308 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com) Nov 26 03:39:35 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:39:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:39:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:39:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:39:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:39:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:39:36 localhost podman[89938]: 2025-11-26 08:39:36.798226414 +0000 UTC m=+0.061591265 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Nov 26 03:39:36 localhost podman[89941]: 2025-11-26 08:39:36.857913399 +0000 UTC m=+0.115670078 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:39:36 localhost podman[89938]: 2025-11-26 08:39:36.880755478 +0000 UTC m=+0.144120329 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-collectd, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 26 03:39:36 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:39:36 localhost podman[89940]: 2025-11-26 08:39:36.93512705 +0000 UTC m=+0.192289670 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, release=1761123044, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:39:36 localhost podman[89939]: 2025-11-26 08:39:36.9822346 +0000 UTC m=+0.241423773 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:39:37 localhost podman[89939]: 2025-11-26 08:39:37.001218641 +0000 UTC m=+0.260407854 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-type=git) Nov 26 03:39:37 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:39:37 localhost podman[89940]: 2025-11-26 08:39:37.015555409 +0000 UTC m=+0.272718009 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public) Nov 26 03:39:37 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:39:37 localhost podman[89937]: 2025-11-26 08:39:36.835111542 +0000 UTC m=+0.097064229 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Nov 26 03:39:37 localhost podman[89937]: 2025-11-26 08:39:37.065757244 +0000 UTC m=+0.327709931 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.expose-services=) Nov 26 03:39:37 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:39:37 localhost podman[89941]: 2025-11-26 08:39:37.240370453 +0000 UTC m=+0.498127132 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:39:37 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:39:37 localhost podman[90148]: 2025-11-26 08:39:37.80668146 +0000 UTC m=+0.088005222 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, maintainer=Guillaume Abrioux , vcs-type=git, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, version=7, 
build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container) Nov 26 03:39:37 localhost systemd[1]: tmp-crun.xvMzi4.mount: Deactivated successfully. Nov 26 03:39:37 localhost podman[90148]: 2025-11-26 08:39:37.936962964 +0000 UTC m=+0.218286656 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, ceph=True) Nov 26 03:39:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:39:53 localhost podman[90290]: 2025-11-26 08:39:53.819788639 +0000 UTC m=+0.082440851 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1) Nov 26 03:39:54 localhost podman[90290]: 2025-11-26 08:39:54.050394032 +0000 UTC m=+0.313046234 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Nov 26 03:39:54 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:39:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:39:59 localhost podman[90319]: 2025-11-26 08:39:59.80788878 +0000 UTC m=+0.074167138 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:39:59 localhost podman[90319]: 2025-11-26 08:39:59.864456509 +0000 UTC m=+0.130734777 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute) Nov 26 03:39:59 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:40:02 localhost sshd[90345]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:40:05 localhost systemd[1]: tmp-crun.Mqr8MR.mount: Deactivated successfully. 
Nov 26 03:40:05 localhost podman[90347]: 2025-11-26 08:40:05.870292494 +0000 UTC m=+0.120732863 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4) Nov 26 03:40:05 localhost podman[90349]: 2025-11-26 08:40:05.831884839 +0000 UTC m=+0.081614517 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) 
Nov 26 03:40:05 localhost podman[90349]: 2025-11-26 08:40:05.917189398 +0000 UTC m=+0.166919066 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z) Nov 26 03:40:05 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:40:05 localhost podman[90348]: 2025-11-26 08:40:05.877783613 +0000 UTC m=+0.128218852 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron) Nov 26 03:40:05 localhost podman[90348]: 2025-11-26 08:40:05.960295996 +0000 UTC m=+0.210731265 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:40:05 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:40:06 localhost podman[90347]: 2025-11-26 08:40:06.01833032 +0000 UTC m=+0.268770649 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git) Nov 26 03:40:06 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:40:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:40:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:40:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:40:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:40:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:40:07 localhost systemd[1]: tmp-crun.NuWSIr.mount: Deactivated successfully. 
Nov 26 03:40:07 localhost podman[90419]: 2025-11-26 08:40:07.844180631 +0000 UTC m=+0.099286418 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:40:07 localhost podman[90421]: 2025-11-26 08:40:07.916690508 +0000 UTC m=+0.163008406 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 26 03:40:07 localhost podman[90419]: 2025-11-26 08:40:07.927418375 +0000 UTC 
m=+0.182524172 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3) Nov 26 03:40:07 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:40:07 localhost podman[90431]: 2025-11-26 08:40:07.880127279 +0000 UTC m=+0.115077780 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64) Nov 26 03:40:08 localhost podman[90418]: 2025-11-26 08:40:08.004022738 +0000 UTC m=+0.260705863 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=iscsid) Nov 26 03:40:08 localhost podman[90421]: 2025-11-26 08:40:08.016357415 +0000 UTC m=+0.262675313 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 26 03:40:08 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:40:08 localhost podman[90418]: 2025-11-26 08:40:08.037628516 +0000 UTC m=+0.294311641 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z) Nov 26 03:40:08 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:40:08 localhost podman[90420]: 2025-11-26 08:40:08.098783496 +0000 UTC m=+0.347259470 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team) Nov 26 03:40:08 localhost podman[90420]: 2025-11-26 08:40:08.180807934 +0000 UTC m=+0.429283858 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:40:08 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. 
Nov 26 03:40:08 localhost podman[90431]: 2025-11-26 08:40:08.284505414 +0000 UTC m=+0.519455905 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container) Nov 26 03:40:08 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:40:17 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:40:17 localhost recover_tripleo_nova_virtqemud[90527]: 61604 Nov 26 03:40:17 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:40:17 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:40:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:40:24 localhost podman[90528]: 2025-11-26 08:40:24.826763407 +0000 UTC m=+0.089395634 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, 
distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1) Nov 26 03:40:25 localhost podman[90528]: 2025-11-26 08:40:25.026126493 +0000 UTC m=+0.288758710 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step1, url=https://www.redhat.com, tcib_managed=true, container_name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:40:25 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:40:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:40:30 localhost systemd[1]: tmp-crun.9ZBYEi.mount: Deactivated successfully. 
Nov 26 03:40:30 localhost podman[90557]: 2025-11-26 08:40:30.818795877 +0000 UTC m=+0.082219015 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Nov 26 03:40:30 localhost podman[90557]: 2025-11-26 08:40:30.875370017 +0000 UTC m=+0.138793145 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, 
release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 26 03:40:30 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:40:36 localhost systemd[1]: tmp-crun.o3iDYx.mount: Deactivated successfully. 
Nov 26 03:40:36 localhost podman[90602]: 2025-11-26 08:40:36.83724117 +0000 UTC m=+0.094918644 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:40:36 localhost podman[90602]: 2025-11-26 08:40:36.853578759 +0000 UTC m=+0.111256323 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true) Nov 26 03:40:36 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:40:36 localhost podman[90601]: 2025-11-26 08:40:36.92224797 +0000 UTC m=+0.179926334 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64) Nov 26 03:40:36 localhost podman[90601]: 2025-11-26 08:40:36.945216082 +0000 UTC m=+0.202894446 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, version=17.1.12) Nov 26 03:40:36 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:40:36 localhost podman[90603]: 2025-11-26 08:40:36.97983221 +0000 UTC m=+0.236541664 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:40:37 localhost podman[90603]: 2025-11-26 08:40:37.001171203 +0000 UTC m=+0.257880687 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, release=1761123044, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=) Nov 26 03:40:37 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:40:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:40:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:40:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:40:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:40:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:40:38 localhost systemd[1]: tmp-crun.GRuqzL.mount: Deactivated successfully. Nov 26 03:40:38 localhost podman[90677]: 2025-11-26 08:40:38.843587439 +0000 UTC m=+0.107460977 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, version=17.1.12, container_name=iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:40:38 localhost podman[90686]: 2025-11-26 08:40:38.892397362 +0000 UTC m=+0.142091566 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:40:38 localhost podman[90677]: 2025-11-26 08:40:38.908725641 +0000 UTC m=+0.172599209 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, release=1761123044, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:40:38 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:40:38 localhost podman[90679]: 2025-11-26 08:40:38.990155961 +0000 UTC m=+0.248910102 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12) Nov 26 03:40:39 localhost podman[90678]: 2025-11-26 08:40:39.041587643 +0000 UTC m=+0.302181351 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=collectd, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd) Nov 26 03:40:39 localhost podman[90678]: 2025-11-26 08:40:39.054662514 +0000 UTC m=+0.315256222 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, release=1761123044, maintainer=OpenStack TripleO Team, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64) Nov 26 03:40:39 localhost podman[90680]: 2025-11-26 08:40:39.091255852 +0000 UTC m=+0.345981530 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:40:39 localhost podman[90679]: 2025-11-26 08:40:39.105282671 +0000 UTC m=+0.364036832 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4) Nov 26 03:40:39 localhost podman[90680]: 2025-11-26 08:40:39.134897576 +0000 UTC m=+0.389623254 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:40:39 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:40:39 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:40:39 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:40:39 localhost podman[90686]: 2025-11-26 08:40:39.289812804 +0000 UTC m=+0.539506998 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:40:39 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:40:41 localhost sshd[90856]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:40:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:40:55 localhost podman[90858]: 2025-11-26 08:40:55.817585965 +0000 UTC m=+0.080511722 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, tcib_managed=true, config_id=tripleo_step1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:40:56 localhost podman[90858]: 2025-11-26 08:40:56.010200775 +0000 UTC m=+0.273126472 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:40:56 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:41:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:41:01 localhost podman[90887]: 2025-11-26 08:41:01.816566161 +0000 UTC m=+0.079669168 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, release=1761123044, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:41:01 localhost podman[90887]: 2025-11-26 08:41:01.846221728 +0000 UTC m=+0.109324795 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:41:01 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:41:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:41:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:41:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:41:07 localhost systemd[1]: tmp-crun.D4t4Rb.mount: Deactivated successfully. 
Nov 26 03:41:07 localhost podman[90912]: 2025-11-26 08:41:07.847379159 +0000 UTC m=+0.109739507 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12) Nov 26 03:41:07 localhost podman[90912]: 2025-11-26 08:41:07.871283599 +0000 UTC m=+0.133643937 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Nov 26 03:41:07 localhost podman[90914]: 2025-11-26 08:41:07.884886425 +0000 UTC m=+0.139177667 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:41:07 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:41:07 localhost podman[90913]: 2025-11-26 08:41:07.948226762 +0000 UTC m=+0.204889176 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, container_name=logrotate_crond) Nov 26 03:41:07 localhost podman[90914]: 2025-11-26 08:41:07.960136996 +0000 UTC m=+0.214428248 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, 
architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:41:07 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:41:07 localhost podman[90913]: 2025-11-26 08:41:07.985327186 +0000 UTC m=+0.241989550 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:41:08 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:41:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:41:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:41:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:41:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:41:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:41:09 localhost systemd[1]: tmp-crun.5HqK4Q.mount: Deactivated successfully. 
Nov 26 03:41:09 localhost podman[90984]: 2025-11-26 08:41:09.840613178 +0000 UTC m=+0.096739199 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=) Nov 26 03:41:09 localhost podman[90984]: 2025-11-26 08:41:09.87533457 +0000 UTC m=+0.131460561 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 26 03:41:09 localhost systemd[1]: tmp-crun.9hKHsf.mount: Deactivated successfully. 
Nov 26 03:41:09 localhost podman[90983]: 2025-11-26 08:41:09.889775071 +0000 UTC m=+0.147573393 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, architecture=x86_64, release=1761123044, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 26 03:41:09 localhost podman[90983]: 2025-11-26 08:41:09.900178839 +0000 UTC m=+0.157977171 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Nov 26 03:41:09 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:41:09 localhost podman[90985]: 2025-11-26 08:41:09.940006797 +0000 UTC m=+0.189773514 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.12, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, architecture=x86_64) Nov 26 03:41:09 localhost podman[90985]: 2025-11-26 08:41:09.962366171 +0000 UTC m=+0.212132958 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, vcs-type=git, version=17.1.12, name=rhosp17/openstack-ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc.) Nov 26 03:41:09 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:41:10 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:41:10 localhost podman[90997]: 2025-11-26 08:41:10.057089877 +0000 UTC m=+0.297900760 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-nova-compute-container, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4) Nov 26 03:41:10 localhost podman[90989]: 2025-11-26 08:41:10.102200427 +0000 UTC m=+0.348636342 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:41:10 localhost podman[90989]: 2025-11-26 08:41:10.144257003 +0000 UTC m=+0.390692958 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:41:10 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:41:10 localhost podman[90997]: 2025-11-26 08:41:10.428198245 +0000 UTC m=+0.669009088 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Nov 26 03:41:10 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:41:21 localhost sshd[91093]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:41:21 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:41:21 localhost recover_tripleo_nova_virtqemud[91096]: 61604 Nov 26 03:41:21 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:41:21 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:41:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:41:26 localhost systemd[1]: tmp-crun.AZIC8d.mount: Deactivated successfully. 
Nov 26 03:41:26 localhost podman[91097]: 2025-11-26 08:41:26.833031024 +0000 UTC m=+0.090856890 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:41:27 localhost podman[91097]: 2025-11-26 08:41:27.041420336 +0000 UTC m=+0.299246212 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:41:27 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:41:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:41:32 localhost systemd[1]: tmp-crun.L62zef.mount: Deactivated successfully. 
Nov 26 03:41:32 localhost podman[91124]: 2025-11-26 08:41:32.834112993 +0000 UTC m=+0.092317883 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com) Nov 26 03:41:32 localhost podman[91124]: 2025-11-26 08:41:32.873413145 +0000 UTC m=+0.131618045 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:41:32 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:41:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:41:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:41:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:41:38 localhost systemd[1]: tmp-crun.2RhCiO.mount: Deactivated successfully. Nov 26 03:41:38 localhost systemd[1]: tmp-crun.Xny3fL.mount: Deactivated successfully. 
Nov 26 03:41:38 localhost podman[91150]: 2025-11-26 08:41:38.88225126 +0000 UTC m=+0.142388445 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:41:38 localhost podman[91152]: 2025-11-26 08:41:38.945192455 +0000 UTC m=+0.196159439 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:41:38 localhost podman[91151]: 2025-11-26 08:41:38.909500564 +0000 UTC m=+0.163348757 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 
cron) Nov 26 03:41:38 localhost podman[91150]: 2025-11-26 08:41:38.964636829 +0000 UTC m=+0.224774014 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:41:38 localhost podman[91152]: 2025-11-26 08:41:38.973289094 +0000 UTC m=+0.224256078 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:41:38 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:41:38 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:41:38 localhost podman[91151]: 2025-11-26 08:41:38.987989314 +0000 UTC m=+0.241837547 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, 
name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, architecture=x86_64) Nov 26 03:41:39 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:41:40 localhost podman[91223]: 2025-11-26 08:41:40.823142137 +0000 UTC m=+0.079309705 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 26 03:41:40 localhost systemd[1]: tmp-crun.6GD4PR.mount: Deactivated successfully. Nov 26 03:41:40 localhost podman[91223]: 2025-11-26 08:41:40.862767029 +0000 UTC m=+0.118934617 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, release=1761123044, url=https://www.redhat.com, 
name=rhosp17/openstack-collectd, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Nov 26 03:41:40 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:41:40 localhost podman[91222]: 2025-11-26 08:41:40.936104331 +0000 UTC m=+0.192457585 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, tcib_managed=true, 
architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:41:40 localhost podman[91225]: 2025-11-26 08:41:40.851107872 +0000 UTC m=+0.098713569 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, container_name=ovn_metadata_agent) Nov 26 03:41:40 localhost podman[91222]: 2025-11-26 08:41:40.971259837 +0000 UTC m=+0.227613051 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
release=1761123044, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z) Nov 26 03:41:40 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:41:40 localhost podman[91237]: 2025-11-26 08:41:40.984645306 +0000 UTC m=+0.228760016 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, url=https://www.redhat.com) Nov 26 03:41:41 localhost podman[91224]: 2025-11-26 08:41:40.905372741 +0000 UTC m=+0.155970909 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, container_name=ovn_controller, vendor=Red Hat, Inc.) 
Nov 26 03:41:41 localhost podman[91225]: 2025-11-26 08:41:41.034445218 +0000 UTC m=+0.282050945 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4) Nov 26 03:41:41 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:41:41 localhost podman[91224]: 2025-11-26 08:41:41.089627466 +0000 UTC m=+0.340225644 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1761123044, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, container_name=ovn_controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:41:41 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:41:41 localhost podman[91237]: 2025-11-26 08:41:41.357345552 +0000 UTC m=+0.601460242 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Nov 26 03:41:41 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:41:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:41:57 localhost systemd[1]: tmp-crun.no8G75.mount: Deactivated successfully. 
Nov 26 03:41:57 localhost podman[91409]: 2025-11-26 08:41:57.803802763 +0000 UTC m=+0.066159533 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:41:58 localhost podman[91409]: 2025-11-26 08:41:58.047062382 +0000 UTC m=+0.309419162 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:41:58 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:42:00 localhost sshd[91437]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:42:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:42:03 localhost systemd[1]: tmp-crun.UWOVXY.mount: Deactivated successfully. 
Nov 26 03:42:03 localhost podman[91439]: 2025-11-26 08:42:03.823011509 +0000 UTC m=+0.080559334 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, io.openshift.expose-services=, tcib_managed=true, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step5, batch=17.1_20251118.1, vendor=Red Hat, Inc.) 
Nov 26 03:42:03 localhost podman[91439]: 2025-11-26 08:42:03.854299816 +0000 UTC m=+0.111847631 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container) Nov 26 03:42:03 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:42:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:42:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:42:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:42:09 localhost podman[91465]: 2025-11-26 08:42:09.835445616 +0000 UTC m=+0.093682575 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, tcib_managed=true, 
name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:42:09 localhost podman[91465]: 2025-11-26 08:42:09.876291156 +0000 UTC m=+0.134528145 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:42:09 localhost systemd[1]: tmp-crun.0K5xVE.mount: Deactivated successfully. Nov 26 03:42:09 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:42:09 localhost podman[91466]: 2025-11-26 08:42:09.944108419 +0000 UTC m=+0.198056507 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:42:09 localhost podman[91467]: 2025-11-26 08:42:09.907350205 +0000 UTC m=+0.159441826 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, container_name=ceilometer_agent_ipmi, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 26 03:42:09 localhost podman[91467]: 2025-11-26 08:42:09.987974131 +0000 UTC m=+0.240065692 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Nov 26 03:42:10 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:42:10 localhost podman[91466]: 2025-11-26 08:42:10.011867541 +0000 UTC m=+0.265815649 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:42:10 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:42:11 localhost systemd[1]: tmp-crun.SFkwzg.mount: Deactivated successfully. 
Nov 26 03:42:11 localhost podman[91536]: 2025-11-26 08:42:11.848437038 +0000 UTC m=+0.101178194 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:42:11 localhost podman[91536]: 2025-11-26 08:42:11.859240499 +0000 UTC m=+0.111981685 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, container_name=iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64) Nov 26 03:42:11 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:42:11 localhost systemd[1]: tmp-crun.ptnx7w.mount: Deactivated successfully. 
Nov 26 03:42:11 localhost podman[91537]: 2025-11-26 08:42:11.905009578 +0000 UTC m=+0.152482723 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git) Nov 26 03:42:11 localhost podman[91545]: 2025-11-26 08:42:11.941063741 +0000 UTC m=+0.179080837 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Nov 26 03:42:11 localhost podman[91537]: 2025-11-26 08:42:11.967844749 +0000 UTC m=+0.215317954 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, version=17.1.12, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Nov 26 03:42:11 localhost podman[91538]: 2025-11-26 08:42:11.822394822 +0000 UTC m=+0.070036032 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 26 03:42:11 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:42:12 localhost podman[91538]: 2025-11-26 08:42:12.007608776 +0000 UTC m=+0.255250026 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, architecture=x86_64, io.openshift.expose-services=, 
com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 26 03:42:12 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:42:12 localhost podman[91555]: 2025-11-26 08:42:12.051497817 +0000 UTC m=+0.282029044 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:42:12 localhost podman[91545]: 2025-11-26 08:42:12.058258424 +0000 UTC m=+0.296275470 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 03:42:12 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:42:12 localhost podman[91555]: 2025-11-26 08:42:12.4535196 +0000 UTC m=+0.684050887 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Nov 26 03:42:12 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:42:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:42:28 localhost systemd[1]: tmp-crun.iMgffS.mount: Deactivated successfully. 
Nov 26 03:42:28 localhost podman[91646]: 2025-11-26 08:42:28.829017822 +0000 UTC m=+0.093051826 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:46Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible) Nov 26 03:42:29 localhost podman[91646]: 2025-11-26 08:42:29.045404509 +0000 UTC m=+0.309438573 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:49:46Z, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public) Nov 26 03:42:29 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:42:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:42:34 localhost systemd[1]: tmp-crun.Nz2HMq.mount: Deactivated successfully. 
Nov 26 03:42:34 localhost podman[91675]: 2025-11-26 08:42:34.824535974 +0000 UTC m=+0.087614611 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:42:34 localhost podman[91675]: 2025-11-26 08:42:34.88035879 +0000 UTC m=+0.143437427 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4) Nov 26 03:42:34 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:42:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:42:37 localhost recover_tripleo_nova_virtqemud[91702]: 61604 Nov 26 03:42:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:42:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:42:39 localhost sshd[91703]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:42:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:42:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:42:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:42:40 localhost podman[91705]: 2025-11-26 08:42:40.25970184 +0000 UTC m=+0.066576397 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 03:42:40 localhost podman[91707]: 2025-11-26 08:42:40.295565025 +0000 UTC m=+0.091964882 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi) Nov 26 03:42:40 localhost podman[91706]: 2025-11-26 08:42:40.318117446 +0000 UTC m=+0.123025243 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) 
Nov 26 03:42:40 localhost podman[91706]: 2025-11-26 08:42:40.325254234 +0000 UTC m=+0.130162061 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:49:32Z, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-type=git, version=17.1.12, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:42:40 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:42:40 localhost podman[91705]: 2025-11-26 08:42:40.346495504 +0000 UTC m=+0.153370101 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:42:40 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:42:40 localhost podman[91707]: 2025-11-26 08:42:40.430523523 +0000 UTC m=+0.226923380 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Nov 26 03:42:40 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:42:42 localhost podman[91778]: 2025-11-26 08:42:42.836543133 +0000 UTC m=+0.086147905 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:42:42 localhost systemd[1]: tmp-crun.ViEKuD.mount: Deactivated successfully. 
Nov 26 03:42:42 localhost podman[91775]: 2025-11-26 08:42:42.893008179 +0000 UTC m=+0.152276207 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 
17.1_20251118.1, distribution-scope=public, container_name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Nov 26 03:42:42 localhost podman[91775]: 2025-11-26 08:42:42.926005648 +0000 UTC m=+0.185273666 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.buildah.version=1.41.4, version=17.1.12) Nov 26 03:42:42 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:42:42 localhost podman[91777]: 2025-11-26 08:42:42.938116969 +0000 UTC m=+0.189922769 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, vcs-type=git) Nov 26 03:42:42 localhost podman[91778]: 2025-11-26 08:42:42.946777193 +0000 UTC m=+0.196381995 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Nov 26 03:42:42 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:42:42 localhost podman[91777]: 2025-11-26 08:42:42.958914074 +0000 UTC m=+0.210719914 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:42:42 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:42:43 localhost podman[91776]: 2025-11-26 08:42:43.045573294 +0000 UTC m=+0.301046686 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-collectd, batch=17.1_20251118.1) Nov 26 03:42:43 localhost podman[91776]: 2025-11-26 08:42:43.055434576 +0000 UTC m=+0.310908028 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 
'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:42:43 localhost systemd[1]: 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:42:43 localhost podman[91784]: 2025-11-26 08:42:43.086096183 +0000 UTC m=+0.331188658 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=) Nov 26 03:42:43 localhost podman[91784]: 2025-11-26 08:42:43.419461157 +0000 UTC m=+0.664553622 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:42:43 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:42:43 localhost systemd[1]: tmp-crun.usNDaH.mount: Deactivated successfully. Nov 26 03:42:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:42:59 localhost podman[91957]: 2025-11-26 08:42:59.841663869 +0000 UTC m=+0.093419088 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:43:00 localhost podman[91957]: 2025-11-26 08:43:00.03141411 +0000 UTC m=+0.283169329 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:43:00 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:43:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:43:05 localhost systemd[1]: tmp-crun.FQxmcL.mount: Deactivated successfully. 
Nov 26 03:43:05 localhost podman[91986]: 2025-11-26 08:43:05.831154605 +0000 UTC m=+0.092461459 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4) Nov 26 03:43:05 localhost podman[91986]: 2025-11-26 08:43:05.892462049 +0000 UTC m=+0.153768883 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 26 03:43:05 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:43:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:43:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:43:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:43:10 localhost podman[92012]: 2025-11-26 08:43:10.824997692 +0000 UTC m=+0.085658380 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team) Nov 26 03:43:10 localhost podman[92012]: 2025-11-26 08:43:10.857115515 +0000 UTC m=+0.117776173 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:43:10 localhost podman[92013]: 2025-11-26 08:43:10.870029109 +0000 UTC m=+0.127902732 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-cron, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=) Nov 26 03:43:10 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:43:10 localhost systemd[1]: tmp-crun.J9xfxT.mount: Deactivated successfully. Nov 26 03:43:10 localhost podman[92014]: 2025-11-26 08:43:10.953697537 +0000 UTC m=+0.207435944 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:43:10 localhost podman[92013]: 2025-11-26 08:43:10.957333879 +0000 UTC m=+0.215207532 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64) Nov 26 03:43:10 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:43:10 localhost podman[92014]: 2025-11-26 08:43:10.990444481 +0000 UTC m=+0.244182888 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true) Nov 26 03:43:11 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:43:13 localhost systemd[1]: tmp-crun.7L3j2F.mount: Deactivated successfully. 
Nov 26 03:43:13 localhost podman[92084]: 2025-11-26 08:43:13.841166058 +0000 UTC m=+0.091008873 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:43:13 localhost podman[92084]: 2025-11-26 08:43:13.852170135 +0000 UTC m=+0.102012990 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 26 03:43:13 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:43:13 localhost podman[92085]: 2025-11-26 08:43:13.892054664 +0000 UTC m=+0.135625148 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4) Nov 26 03:43:13 localhost podman[92085]: 2025-11-26 08:43:13.950231953 +0000 UTC m=+0.193802447 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4) Nov 26 03:43:13 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:43:13 localhost podman[92087]: 2025-11-26 08:43:13.959250949 +0000 UTC m=+0.201613566 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., 
io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:43:13 localhost podman[92083]: 2025-11-26 08:43:13.988519064 +0000 UTC m=+0.238143763 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Nov 26 03:43:14 localhost podman[92083]: 2025-11-26 08:43:14.002232254 +0000 UTC m=+0.251856943 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, release=1761123044, distribution-scope=public, architecture=x86_64, container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1) Nov 26 03:43:14 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:43:14 localhost podman[92086]: 2025-11-26 08:43:14.093623277 +0000 UTC m=+0.338918214 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12) Nov 26 03:43:14 localhost podman[92086]: 2025-11-26 08:43:14.155335585 +0000 UTC m=+0.400630522 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ovn_metadata_agent, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, architecture=x86_64) Nov 26 03:43:14 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. Nov 26 03:43:14 localhost podman[92087]: 2025-11-26 08:43:14.354526225 +0000 UTC m=+0.596888842 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:43:14 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:43:14 localhost systemd[1]: tmp-crun.JyDM2K.mount: Deactivated successfully. Nov 26 03:43:20 localhost sshd[92190]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:43:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:43:30 localhost systemd[1]: tmp-crun.hhki4k.mount: Deactivated successfully. 
Nov 26 03:43:30 localhost podman[92193]: 2025-11-26 08:43:30.844605333 +0000 UTC m=+0.102254119 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:43:31 localhost podman[92193]: 2025-11-26 08:43:31.040288726 +0000 UTC m=+0.297937502 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:43:31 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:43:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:43:36 localhost podman[92222]: 2025-11-26 08:43:36.82534843 +0000 UTC m=+0.086784734 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:43:36 localhost podman[92222]: 2025-11-26 08:43:36.886742908 +0000 UTC m=+0.148179202 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, container_name=nova_compute, release=1761123044) Nov 26 03:43:36 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:43:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:43:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:43:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:43:41 localhost systemd[1]: tmp-crun.GjjY1k.mount: Deactivated successfully. 
Nov 26 03:43:41 localhost podman[92248]: 2025-11-26 08:43:41.837917772 +0000 UTC m=+0.098778251 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, 
com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:43:41 localhost podman[92249]: 2025-11-26 08:43:41.888097196 +0000 UTC m=+0.145739557 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible) Nov 26 03:43:41 localhost podman[92248]: 2025-11-26 08:43:41.893496941 +0000 UTC m=+0.154357430 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, 
managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 26 03:43:41 localhost podman[92249]: 2025-11-26 08:43:41.90129434 +0000 UTC m=+0.158936731 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, release=1761123044, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team) Nov 26 03:43:41 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:43:41 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:43:41 localhost podman[92250]: 2025-11-26 08:43:41.972007632 +0000 UTC m=+0.226931950 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible) Nov 26 03:43:42 localhost podman[92250]: 2025-11-26 08:43:42.019269607 +0000 UTC m=+0.274193905 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 26 03:43:42 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:43:42 localhost systemd[1]: tmp-crun.OqO5sJ.mount: Deactivated successfully. Nov 26 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:43:44 localhost systemd[1]: tmp-crun.Kuciai.mount: Deactivated successfully. 
Nov 26 03:43:44 localhost podman[92338]: 2025-11-26 08:43:44.540258714 +0000 UTC m=+0.118097273 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 26 03:43:44 localhost podman[92340]: 2025-11-26 08:43:44.483725365 +0000 UTC m=+0.058132148 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 26 03:43:44 localhost podman[92341]: 2025-11-26 08:43:44.590030796 +0000 UTC m=+0.163145430 container health_status 
9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4) Nov 26 03:43:44 localhost podman[92339]: 2025-11-26 08:43:44.512167846 +0000 UTC m=+0.086248539 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:43:44 localhost podman[92340]: 2025-11-26 08:43:44.618239238 +0000 UTC m=+0.192646011 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 
17.1_20251118.1, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:43:44 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated 
successfully. Nov 26 03:43:44 localhost podman[92336]: 2025-11-26 08:43:44.565732513 +0000 UTC m=+0.142548760 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:43:44 localhost podman[92338]: 2025-11-26 08:43:44.67747511 +0000 UTC m=+0.255313649 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd) Nov 26 03:43:44 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:43:44 localhost podman[92339]: 2025-11-26 08:43:44.696754899 +0000 UTC m=+0.270835582 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044) Nov 26 03:43:44 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Deactivated successfully. Nov 26 03:43:44 localhost podman[92336]: 2025-11-26 08:43:44.752015689 +0000 UTC m=+0.328831926 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044) Nov 26 03:43:44 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:43:44 localhost podman[92341]: 2025-11-26 08:43:44.978379221 +0000 UTC m=+0.551493815 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:43:44 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:43:57 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:43:57 localhost recover_tripleo_nova_virtqemud[92505]: 61604 Nov 26 03:43:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:43:57 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:44:01 localhost sshd[92506]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:44:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:44:01 localhost systemd[1]: tmp-crun.AwgZP2.mount: Deactivated successfully. 
Nov 26 03:44:01 localhost podman[92508]: 2025-11-26 08:44:01.794917652 +0000 UTC m=+0.082486014 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:44:01 localhost podman[92508]: 2025-11-26 08:44:01.97211608 +0000 UTC m=+0.259684462 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:44:01 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:44:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:44:07 localhost systemd[1]: tmp-crun.itbhxK.mount: Deactivated successfully. 
Nov 26 03:44:07 localhost podman[92536]: 2025-11-26 08:44:07.876413792 +0000 UTC m=+0.098635812 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, release=1761123044, container_name=nova_compute, maintainer=OpenStack TripleO Team) Nov 26 03:44:07 localhost podman[92536]: 2025-11-26 08:44:07.930032346 +0000 UTC m=+0.152254376 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, architecture=x86_64, 
com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 26 03:44:07 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:44:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:44:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:44:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:44:12 localhost systemd[1]: tmp-crun.pLgeFS.mount: Deactivated successfully. 
Nov 26 03:44:12 localhost podman[92562]: 2025-11-26 08:44:12.836568429 +0000 UTC m=+0.101021106 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Nov 26 03:44:12 localhost podman[92562]: 2025-11-26 08:44:12.889357487 +0000 UTC m=+0.153810124 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:11:48Z, 
io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1) Nov 26 03:44:12 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:44:12 localhost systemd[1]: tmp-crun.NMTe3C.mount: Deactivated successfully. 
Nov 26 03:44:12 localhost podman[92564]: 2025-11-26 08:44:12.980103486 +0000 UTC m=+0.238132504 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Nov 26 03:44:13 localhost podman[92564]: 2025-11-26 08:44:13.009144881 +0000 UTC m=+0.267173929 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, distribution-scope=public, 
release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:44:13 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:44:13 localhost podman[92563]: 2025-11-26 08:44:13.027880799 +0000 UTC m=+0.289022934 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public) Nov 26 03:44:13 localhost podman[92563]: 2025-11-26 08:44:13.044486541 +0000 UTC m=+0.305628676 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, container_name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4) Nov 26 03:44:13 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:44:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:44:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:44:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:44:14 localhost podman[92634]: 2025-11-26 08:44:14.808904927 +0000 UTC m=+0.072290479 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, container_name=ovn_controller, io.openshift.expose-services=) Nov 26 03:44:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:44:14 localhost podman[92634]: 2025-11-26 08:44:14.828119599 +0000 UTC m=+0.091505161 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:44:14 localhost podman[92634]: unhealthy Nov 26 03:44:14 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:44:14 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:44:14 localhost podman[92635]: 2025-11-26 08:44:14.874580782 +0000 UTC m=+0.132812056 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z) Nov 26 03:44:14 localhost podman[92635]: 2025-11-26 
08:44:14.917594278 +0000 UTC m=+0.175825542 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 26 03:44:14 localhost systemd[1]: tmp-crun.O2J28C.mount: Deactivated successfully. Nov 26 03:44:14 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:44:14 localhost podman[92677]: 2025-11-26 08:44:14.966697353 +0000 UTC m=+0.132263769 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, managed_by=tripleo_ansible, container_name=iscsid, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:44:14 localhost podman[92633]: 2025-11-26 08:44:14.92444232 +0000 UTC m=+0.189278038 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-collectd, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:44:15 localhost podman[92677]: 2025-11-26 08:44:14.999917308 +0000 UTC m=+0.165483724 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, version=17.1.12, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044) Nov 26 03:44:15 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:44:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:44:15 localhost podman[92633]: 2025-11-26 08:44:15.057203744 +0000 UTC m=+0.322039512 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, release=1761123044, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:44:15 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:44:15 localhost podman[92718]: 2025-11-26 08:44:15.117221464 +0000 UTC m=+0.078007936 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, distribution-scope=public, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.expose-services=) Nov 26 03:44:15 localhost podman[92718]: 2025-11-26 08:44:15.520293043 +0000 UTC m=+0.481079515 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z) Nov 26 03:44:15 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:44:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:44:32 localhost podman[92741]: 2025-11-26 08:44:32.83980833 +0000 UTC m=+0.100678835 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, container_name=metrics_qdr, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible) Nov 26 03:44:33 localhost podman[92741]: 2025-11-26 08:44:33.0476849 +0000 UTC m=+0.308555475 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, container_name=metrics_qdr, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:44:33 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:44:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:44:38 localhost podman[92768]: 2025-11-26 08:44:38.798739195 +0000 UTC m=+0.062554929 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step5, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:44:38 localhost podman[92768]: 2025-11-26 08:44:38.831263269 +0000 UTC m=+0.095078983 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, architecture=x86_64, container_name=nova_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:44:38 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:44:41 localhost sshd[92794]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:44:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:44:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:44:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:44:43 localhost systemd[1]: tmp-crun.MGZ9oH.mount: Deactivated successfully. 
Nov 26 03:44:43 localhost podman[92797]: 2025-11-26 08:44:43.82369684 +0000 UTC m=+0.086438026 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z) Nov 26 03:44:43 localhost podman[92797]: 2025-11-26 08:44:43.85839303 +0000 UTC m=+0.121134186 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container) Nov 26 03:44:43 localhost podman[92796]: 2025-11-26 08:44:43.866069467 +0000 UTC m=+0.130674571 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team) Nov 26 03:44:43 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:44:43 localhost podman[92798]: 2025-11-26 08:44:43.925807469 +0000 UTC m=+0.181359124 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:44:43 localhost podman[92796]: 2025-11-26 08:44:43.949987005 +0000 UTC m=+0.214592149 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, vcs-type=git, tcib_managed=true) Nov 26 03:44:43 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:44:43 localhost podman[92798]: 2025-11-26 08:44:43.973349625 +0000 UTC m=+0.228901310 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:44:43 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:44:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:44:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:44:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:44:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:44:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:44:45 localhost podman[92868]: 2025-11-26 08:44:45.839812647 +0000 UTC m=+0.096221727 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1) Nov 26 03:44:45 localhost podman[92866]: 2025-11-26 08:44:45.890089698 +0000 UTC m=+0.150723878 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:44:45 localhost podman[92868]: 2025-11-26 08:44:45.918265687 +0000 UTC m=+0.174674797 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git) Nov 26 03:44:45 localhost podman[92868]: unhealthy Nov 26 03:44:45 localhost podman[92869]: 2025-11-26 08:44:45.932114864 +0000 UTC m=+0.183592522 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, url=https://www.redhat.com) Nov 26 03:44:45 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:44:45 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:44:45 localhost podman[92867]: 2025-11-26 08:44:45.993748654 +0000 UTC m=+0.252233729 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, container_name=collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:44:46 localhost podman[92866]: 2025-11-26 08:44:46.006671193 +0000 UTC m=+0.267305323 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-iscsid-container, architecture=x86_64, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:44:46 localhost podman[92869]: 2025-11-26 08:44:46.016333571 
+0000 UTC m=+0.267811239 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Nov 26 03:44:46 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:44:46 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Deactivated successfully. 
Nov 26 03:44:46 localhost podman[92867]: 2025-11-26 08:44:46.057324895 +0000 UTC m=+0.315809970 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:44:46 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:44:46 localhost podman[92875]: 2025-11-26 08:44:46.146823995 +0000 UTC m=+0.395361052 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:36:58Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Nov 26 03:44:46 localhost podman[92875]: 2025-11-26 08:44:46.583352835 +0000 UTC m=+0.831889832 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
config_id=tripleo_step4, container_name=nova_migration_target, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:44:46 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:44:46 localhost systemd[1]: tmp-crun.ObGUcp.mount: Deactivated successfully. Nov 26 03:45:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:45:03 localhost podman[93054]: 2025-11-26 08:45:03.824182317 +0000 UTC m=+0.090267484 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:45:03 localhost podman[93054]: 2025-11-26 08:45:03.994718715 +0000 UTC m=+0.260803852 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=metrics_qdr, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z) Nov 26 03:45:04 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:45:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:45:09 localhost podman[93083]: 2025-11-26 08:45:09.822539035 +0000 UTC m=+0.083660430 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:45:09 localhost podman[93083]: 2025-11-26 08:45:09.852176079 +0000 UTC m=+0.113297504 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, 
com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4) Nov 26 03:45:09 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:45:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:45:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:45:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:45:14 localhost systemd[1]: tmp-crun.NteaJZ.mount: Deactivated successfully. 
Nov 26 03:45:14 localhost podman[93110]: 2025-11-26 08:45:14.863855895 +0000 UTC m=+0.103995138 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, container_name=logrotate_crond) Nov 26 03:45:14 localhost podman[93110]: 2025-11-26 08:45:14.905281932 +0000 UTC m=+0.145421125 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, vcs-type=git) Nov 26 03:45:14 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:45:14 localhost podman[93109]: 2025-11-26 08:45:14.949024621 +0000 UTC m=+0.191652861 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team) Nov 26 03:45:14 localhost podman[93109]: 2025-11-26 08:45:14.974867258 +0000 UTC m=+0.217495518 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, batch=17.1_20251118.1) Nov 26 03:45:15 localhost podman[93111]: 2025-11-26 08:45:15.01416941 +0000 UTC m=+0.248069901 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true) Nov 26 03:45:15 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:45:15 localhost podman[93111]: 2025-11-26 08:45:15.06835469 +0000 UTC m=+0.302255101 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:45:15 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:45:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:45:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:45:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:45:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:45:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:45:16 localhost podman[93182]: 2025-11-26 08:45:16.829427324 +0000 UTC m=+0.092086630 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, 
url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:45:16 localhost podman[93182]: 2025-11-26 08:45:16.838487473 +0000 UTC m=+0.101146789 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 26 03:45:16 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:45:16 localhost podman[93185]: 2025-11-26 08:45:16.887143573 +0000 UTC m=+0.142067692 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:45:16 localhost podman[93185]: 2025-11-26 08:45:16.907441839 +0000 UTC m=+0.162365988 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible) Nov 26 03:45:16 localhost podman[93191]: 2025-11-26 08:45:16.939938561 +0000 UTC m=+0.190440263 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=) Nov 26 03:45:16 localhost podman[93184]: 2025-11-26 08:45:16.983881937 +0000 UTC m=+0.240649912 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:45:17 localhost podman[93183]: 2025-11-26 08:45:17.044639549 +0000 UTC m=+0.303488298 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd) Nov 26 03:45:17 localhost podman[93185]: unhealthy Nov 26 03:45:17 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, 
status=1/FAILURE Nov 26 03:45:17 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:45:17 localhost podman[93184]: 2025-11-26 08:45:17.074035426 +0000 UTC m=+0.330803361 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ovn_controller) Nov 26 03:45:17 localhost podman[93184]: unhealthy Nov 26 03:45:17 localhost podman[93183]: 2025-11-26 08:45:17.085260422 +0000 UTC m=+0.344109151 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, container_name=collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step3, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Nov 26 03:45:17 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:45:17 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:45:17 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:45:17 localhost podman[93191]: 2025-11-26 08:45:17.28788041 +0000 UTC m=+0.538382142 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1) Nov 26 03:45:17 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:45:17 localhost systemd[1]: tmp-crun.EBUXvD.mount: Deactivated successfully. Nov 26 03:45:19 localhost sshd[93280]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:45:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:45:34 localhost podman[93282]: 2025-11-26 08:45:34.836019639 +0000 UTC m=+0.090234303 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64) Nov 26 03:45:34 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:45:34 localhost recover_tripleo_nova_virtqemud[93309]: 61604 Nov 26 03:45:34 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:45:34 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 03:45:35 localhost podman[93282]: 2025-11-26 08:45:35.057594652 +0000 UTC m=+0.311809216 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, config_id=tripleo_step1, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:45:35 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:45:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:45:40 localhost systemd[1]: tmp-crun.k9IDKq.mount: Deactivated successfully. Nov 26 03:45:40 localhost podman[93312]: 2025-11-26 08:45:40.823024249 +0000 UTC m=+0.087913242 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step5, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, release=1761123044, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible) Nov 26 03:45:40 localhost podman[93312]: 2025-11-26 08:45:40.84998063 +0000 UTC m=+0.114869653 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public) Nov 26 03:45:40 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:45:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:45:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. 
Nov 26 03:45:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:45:45 localhost systemd[1]: tmp-crun.4bonlF.mount: Deactivated successfully. Nov 26 03:45:45 localhost podman[93339]: 2025-11-26 08:45:45.814693867 +0000 UTC m=+0.079400339 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4) Nov 26 03:45:45 localhost systemd[1]: tmp-crun.3i3uzm.mount: Deactivated successfully. 
Nov 26 03:45:45 localhost podman[93340]: 2025-11-26 08:45:45.838372277 +0000 UTC m=+0.096559839 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true) Nov 26 03:45:45 localhost podman[93339]: 2025-11-26 08:45:45.846046263 +0000 UTC m=+0.110752675 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-type=git, container_name=ceilometer_agent_compute, release=1761123044, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Nov 26 03:45:45 localhost podman[93340]: 2025-11-26 08:45:45.871906171 +0000 UTC m=+0.130093733 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, release=1761123044) Nov 26 03:45:45 localhost podman[93344]: 2025-11-26 08:45:45.878232046 +0000 UTC m=+0.132590609 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, vcs-type=git, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 26 03:45:45 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:45:45 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:45:45 localhost podman[93344]: 2025-11-26 08:45:45.92829404 +0000 UTC m=+0.182652663 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.expose-services=) Nov 26 03:45:45 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:45:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:45:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:45:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:45:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:45:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:45:47 localhost systemd[1]: tmp-crun.Q0v25c.mount: Deactivated successfully. Nov 26 03:45:47 localhost podman[93419]: 2025-11-26 08:45:47.850322216 +0000 UTC m=+0.101901623 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:45:47 localhost podman[93415]: 2025-11-26 08:45:47.826061387 +0000 UTC m=+0.080768742 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044) Nov 26 03:45:47 localhost podman[93414]: 2025-11-26 08:45:47.889567516 +0000 UTC m=+0.148364816 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, 
batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:45:47 localhost podman[93419]: 2025-11-26 08:45:47.892812466 +0000 UTC m=+0.144391863 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4) Nov 26 03:45:47 localhost podman[93414]: 2025-11-26 08:45:47.899318236 +0000 UTC m=+0.158115536 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:45:47 localhost podman[93419]: unhealthy Nov 26 03:45:47 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:45:47 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:45:47 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:45:47 localhost podman[93413]: 2025-11-26 08:45:47.988464865 +0000 UTC m=+0.251427023 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z) Nov 26 03:45:48 localhost podman[93413]: 2025-11-26 08:45:48.020924826 +0000 UTC m=+0.283886914 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, release=1761123044, distribution-scope=public, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4) Nov 26 03:45:48 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:45:48 localhost podman[93422]: 2025-11-26 08:45:48.038878229 +0000 UTC m=+0.288504047 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
container_name=nova_migration_target, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:45:48 localhost podman[93415]: 2025-11-26 08:45:48.05735429 +0000 UTC m=+0.312061605 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 26 03:45:48 localhost podman[93415]: unhealthy Nov 26 03:45:48 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:45:48 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:45:48 localhost podman[93422]: 2025-11-26 08:45:48.428344909 +0000 UTC m=+0.677970697 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:45:48 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:46:01 localhost sshd[93638]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 5692 writes, 25K keys, 5692 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5692 writes, 763 syncs, 7.46 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 03:46:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:46:05 localhost podman[93640]: 2025-11-26 08:46:05.83095956 +0000 UTC m=+0.090899754 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Nov 26 03:46:06 localhost podman[93640]: 2025-11-26 08:46:06.021529326 +0000 UTC m=+0.281469410 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:46:06 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 03:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 4860 writes, 21K keys, 4860 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4860 writes, 621 syncs, 7.83 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 03:46:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:46:11 localhost systemd[1]: tmp-crun.YDJNnp.mount: Deactivated successfully. Nov 26 03:46:11 localhost podman[93669]: 2025-11-26 08:46:11.817434573 +0000 UTC m=+0.079793090 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container) Nov 26 03:46:11 localhost podman[93669]: 2025-11-26 08:46:11.857352915 +0000 UTC m=+0.119711432 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5) Nov 26 03:46:11 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:46:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:46:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. 
Nov 26 03:46:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:46:16 localhost systemd[1]: tmp-crun.uAYzx9.mount: Deactivated successfully. Nov 26 03:46:16 localhost podman[93695]: 2025-11-26 08:46:16.837211289 +0000 UTC m=+0.101375916 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:46:16 localhost podman[93695]: 2025-11-26 08:46:16.867435292 +0000 UTC m=+0.131599869 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team) Nov 26 03:46:16 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:46:16 localhost podman[93697]: 2025-11-26 08:46:16.928048051 +0000 UTC m=+0.184009285 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, version=17.1.12) Nov 26 03:46:16 localhost podman[93696]: 2025-11-26 08:46:16.894960651 +0000 UTC m=+0.155261500 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, 
com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1) Nov 26 03:46:16 localhost podman[93696]: 2025-11-26 08:46:16.976464484 +0000 UTC m=+0.236765363 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:46:16 localhost podman[93697]: 2025-11-26 08:46:16.9825031 +0000 UTC m=+0.238464344 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 26 03:46:16 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:46:17 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:46:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:46:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:46:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:46:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:46:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:46:18 localhost systemd[1]: tmp-crun.Pt65ye.mount: Deactivated successfully. 
Nov 26 03:46:18 localhost podman[93769]: 2025-11-26 08:46:18.825009334 +0000 UTC m=+0.087258582 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, name=rhosp17/openstack-collectd, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc.) Nov 26 03:46:18 localhost podman[93771]: 2025-11-26 08:46:18.83657016 +0000 UTC m=+0.092834543 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=) Nov 26 03:46:18 localhost podman[93769]: 2025-11-26 08:46:18.860232009 +0000 UTC 
m=+0.122481267 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd) Nov 26 03:46:18 localhost podman[93771]: 2025-11-26 08:46:18.879341329 +0000 UTC m=+0.135605702 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1) Nov 26 03:46:18 localhost podman[93771]: unhealthy Nov 26 03:46:18 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:46:18 localhost systemd[1]: 
670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:46:18 localhost podman[93772]: 2025-11-26 08:46:18.895674002 +0000 UTC m=+0.149310194 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, release=1761123044, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4) Nov 26 03:46:18 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:46:18 localhost podman[93768]: 2025-11-26 08:46:18.937550004 +0000 UTC m=+0.199578765 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, release=1761123044, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc.) 
Nov 26 03:46:18 localhost podman[93768]: 2025-11-26 08:46:18.947210471 +0000 UTC m=+0.209239232 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 26 03:46:18 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:46:18 localhost podman[93770]: 2025-11-26 08:46:18.86998457 +0000 UTC m=+0.128608786 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:46:19 localhost podman[93770]: 2025-11-26 08:46:19.001354041 +0000 UTC m=+0.259978297 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, 
architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:46:19 localhost podman[93770]: unhealthy Nov 26 03:46:19 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:46:19 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:46:19 localhost podman[93772]: 2025-11-26 08:46:19.269501169 +0000 UTC m=+0.523137361 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
url=https://www.redhat.com, container_name=nova_migration_target, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4) Nov 26 03:46:19 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:46:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:46:36 localhost podman[93866]: 2025-11-26 08:46:36.826426807 +0000 UTC m=+0.083924629 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Nov 26 03:46:37 localhost podman[93866]: 2025-11-26 08:46:37.047607277 +0000 UTC m=+0.305105119 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Nov 26 03:46:37 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:46:42 localhost sshd[93894]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:46:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:46:42 localhost systemd[1]: tmp-crun.wW6oOk.mount: Deactivated successfully. Nov 26 03:46:42 localhost podman[93896]: 2025-11-26 08:46:42.540470299 +0000 UTC m=+0.093353049 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:46:42 localhost podman[93896]: 2025-11-26 08:46:42.576331835 +0000 UTC m=+0.129214585 container exec_died 
f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64) Nov 26 03:46:42 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:46:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:46:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:46:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:46:47 localhost systemd[1]: tmp-crun.i3r4rj.mount: Deactivated successfully. 
Nov 26 03:46:47 localhost podman[93922]: 2025-11-26 08:46:47.814797683 +0000 UTC m=+0.084279919 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
container_name=ceilometer_agent_compute, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:46:47 localhost systemd[1]: tmp-crun.7C5M0N.mount: Deactivated successfully. Nov 26 03:46:47 localhost podman[93923]: 2025-11-26 08:46:47.830061045 +0000 UTC m=+0.093657270 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vcs-type=git, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 26 03:46:47 localhost podman[93922]: 2025-11-26 08:46:47.835918535 +0000 UTC m=+0.105400751 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 26 03:46:47 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:46:47 localhost podman[93923]: 2025-11-26 08:46:47.871520642 +0000 UTC m=+0.135116897 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond) Nov 26 03:46:47 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:46:47 localhost podman[93924]: 2025-11-26 08:46:47.886093082 +0000 UTC m=+0.145726154 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:12:45Z, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com) Nov 26 03:46:47 localhost podman[93924]: 2025-11-26 08:46:47.908340088 +0000 UTC m=+0.167973170 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, 
architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:46:47 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:46:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:46:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:46:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:46:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:46:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:46:49 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:46:49 localhost recover_tripleo_nova_virtqemud[94025]: 61604 Nov 26 03:46:49 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:46:49 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:46:49 localhost systemd[1]: tmp-crun.4LVicS.mount: Deactivated successfully. 
Nov 26 03:46:49 localhost podman[93991]: 2025-11-26 08:46:49.841272711 +0000 UTC m=+0.103247415 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, 
tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Nov 26 03:46:49 localhost podman[93992]: 2025-11-26 08:46:49.853041093 +0000 UTC m=+0.102457050 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Nov 26 03:46:49 localhost podman[93991]: 2025-11-26 08:46:49.880344904 +0000 UTC m=+0.142319648 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, release=1761123044) Nov 26 
03:46:49 localhost podman[93998]: 2025-11-26 08:46:49.894909304 +0000 UTC m=+0.141111262 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, vcs-type=git, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:46:49 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:46:49 localhost podman[93998]: 2025-11-26 08:46:49.938353333 +0000 UTC m=+0.184555321 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:46:49 localhost podman[93998]: unhealthy Nov 26 03:46:49 localhost podman[94007]: 2025-11-26 08:46:49.946542536 +0000 UTC m=+0.186264324 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 26 03:46:49 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:46:49 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:46:49 localhost podman[93992]: 2025-11-26 08:46:49.967831573 +0000 UTC m=+0.217247570 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 26 03:46:49 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:46:50 localhost podman[93999]: 2025-11-26 08:46:50.061128479 +0000 UTC m=+0.302875760 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:46:50 localhost podman[93999]: 2025-11-26 08:46:50.080391653 +0000 UTC m=+0.322138934 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com) Nov 26 03:46:50 localhost podman[93999]: unhealthy Nov 26 03:46:50 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:46:50 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:46:50 localhost podman[94007]: 2025-11-26 08:46:50.336282154 +0000 UTC m=+0.576003942 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, container_name=nova_migration_target, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:46:50 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:46:59 localhost podman[94226]: Nov 26 03:46:59 localhost podman[94226]: 2025-11-26 08:46:59.347799136 +0000 UTC m=+0.084511337 container create 9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_jones, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, io.buildah.version=1.33.12, ceph=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, 
com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 03:46:59 localhost systemd[1]: Started libpod-conmon-9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e.scope. Nov 26 03:46:59 localhost podman[94226]: 2025-11-26 08:46:59.314160519 +0000 UTC m=+0.050872760 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 03:46:59 localhost systemd[1]: Started libcrun container. Nov 26 03:46:59 localhost podman[94226]: 2025-11-26 08:46:59.453064682 +0000 UTC m=+0.189776893 container init 9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_jones, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main) Nov 26 03:46:59 localhost podman[94226]: 2025-11-26 08:46:59.465836596 +0000 UTC m=+0.202548807 container start 9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=intelligent_jones, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, release=553, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 03:46:59 localhost podman[94226]: 2025-11-26 08:46:59.466145985 +0000 UTC m=+0.202858206 container attach 9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_jones, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, ceph=True, release=553, RELEASE=main, GIT_CLEAN=True, version=7, architecture=x86_64, io.openshift.expose-services=) Nov 26 03:46:59 localhost intelligent_jones[94241]: 167 167 Nov 26 03:46:59 localhost systemd[1]: libpod-9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e.scope: Deactivated successfully. 
Nov 26 03:46:59 localhost podman[94226]: 2025-11-26 08:46:59.47340987 +0000 UTC m=+0.210122131 container died 9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_jones, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., release=553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , ceph=True, CEPH_POINT_RELEASE=, name=rhceph, version=7, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 03:46:59 localhost podman[94246]: 2025-11-26 08:46:59.567357246 +0000 UTC m=+0.085848968 container remove 9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_jones, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , release=553, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 03:46:59 localhost systemd[1]: libpod-conmon-9e9f2535dd793ae3f8167ca717389d59749bb8ebd69cd5443b122010cf1daa1e.scope: Deactivated successfully. Nov 26 03:46:59 localhost podman[94267]: Nov 26 03:46:59 localhost podman[94267]: 2025-11-26 08:46:59.807250523 +0000 UTC m=+0.088796069 container create 7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
io.k8s.description=Red Hat Ceph Storage 7) Nov 26 03:46:59 localhost systemd[1]: Started libpod-conmon-7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770.scope. Nov 26 03:46:59 localhost systemd[1]: Started libcrun container. Nov 26 03:46:59 localhost podman[94267]: 2025-11-26 08:46:59.774165123 +0000 UTC m=+0.055710709 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 03:46:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2cefb109f6fb223f719f6e3f8c0202f4c9985ee6bdb9a9951389cfb21bcc0461/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 03:46:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2cefb109f6fb223f719f6e3f8c0202f4c9985ee6bdb9a9951389cfb21bcc0461/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 03:46:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2cefb109f6fb223f719f6e3f8c0202f4c9985ee6bdb9a9951389cfb21bcc0461/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 03:46:59 localhost podman[94267]: 2025-11-26 08:46:59.883160334 +0000 UTC m=+0.164705880 container init 7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, 
GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph) Nov 26 03:46:59 localhost podman[94267]: 2025-11-26 08:46:59.894996249 +0000 UTC m=+0.176541805 container start 7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, ceph=True, vcs-type=git, io.buildah.version=1.33.12, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 03:46:59 localhost podman[94267]: 2025-11-26 08:46:59.895402672 +0000 UTC m=+0.176948268 container attach 7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, 
io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, vcs-type=git, name=rhceph) Nov 26 03:47:00 localhost systemd[1]: var-lib-containers-storage-overlay-dd612e99bed55f3e5e3fcccf7aeacc5e48448a88a544639c795c3c6ff9e8ed6b-merged.mount: Deactivated successfully. 
Nov 26 03:47:01 localhost sweet_faraday[94283]: [ Nov 26 03:47:01 localhost sweet_faraday[94283]: { Nov 26 03:47:01 localhost sweet_faraday[94283]: "available": false, Nov 26 03:47:01 localhost sweet_faraday[94283]: "ceph_device": false, Nov 26 03:47:01 localhost sweet_faraday[94283]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 26 03:47:01 localhost sweet_faraday[94283]: "lsm_data": {}, Nov 26 03:47:01 localhost sweet_faraday[94283]: "lvs": [], Nov 26 03:47:01 localhost sweet_faraday[94283]: "path": "/dev/sr0", Nov 26 03:47:01 localhost sweet_faraday[94283]: "rejected_reasons": [ Nov 26 03:47:01 localhost sweet_faraday[94283]: "Insufficient space (<5GB)", Nov 26 03:47:01 localhost sweet_faraday[94283]: "Has a FileSystem" Nov 26 03:47:01 localhost sweet_faraday[94283]: ], Nov 26 03:47:01 localhost sweet_faraday[94283]: "sys_api": { Nov 26 03:47:01 localhost sweet_faraday[94283]: "actuators": null, Nov 26 03:47:01 localhost sweet_faraday[94283]: "device_nodes": "sr0", Nov 26 03:47:01 localhost sweet_faraday[94283]: "human_readable_size": "482.00 KB", Nov 26 03:47:01 localhost sweet_faraday[94283]: "id_bus": "ata", Nov 26 03:47:01 localhost sweet_faraday[94283]: "model": "QEMU DVD-ROM", Nov 26 03:47:01 localhost sweet_faraday[94283]: "nr_requests": "2", Nov 26 03:47:01 localhost sweet_faraday[94283]: "partitions": {}, Nov 26 03:47:01 localhost sweet_faraday[94283]: "path": "/dev/sr0", Nov 26 03:47:01 localhost sweet_faraday[94283]: "removable": "1", Nov 26 03:47:01 localhost sweet_faraday[94283]: "rev": "2.5+", Nov 26 03:47:01 localhost sweet_faraday[94283]: "ro": "0", Nov 26 03:47:01 localhost sweet_faraday[94283]: "rotational": "1", Nov 26 03:47:01 localhost sweet_faraday[94283]: "sas_address": "", Nov 26 03:47:01 localhost sweet_faraday[94283]: "sas_device_handle": "", Nov 26 03:47:01 localhost sweet_faraday[94283]: "scheduler_mode": "mq-deadline", Nov 26 03:47:01 localhost sweet_faraday[94283]: "sectors": 0, Nov 26 03:47:01 localhost sweet_faraday[94283]: 
"sectorsize": "2048", Nov 26 03:47:01 localhost sweet_faraday[94283]: "size": 493568.0, Nov 26 03:47:01 localhost sweet_faraday[94283]: "support_discard": "0", Nov 26 03:47:01 localhost sweet_faraday[94283]: "type": "disk", Nov 26 03:47:01 localhost sweet_faraday[94283]: "vendor": "QEMU" Nov 26 03:47:01 localhost sweet_faraday[94283]: } Nov 26 03:47:01 localhost sweet_faraday[94283]: } Nov 26 03:47:01 localhost sweet_faraday[94283]: ] Nov 26 03:47:01 localhost systemd[1]: libpod-7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770.scope: Deactivated successfully. Nov 26 03:47:01 localhost systemd[1]: libpod-7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770.scope: Consumed 1.226s CPU time. Nov 26 03:47:01 localhost podman[96396]: 2025-11-26 08:47:01.105299568 +0000 UTC m=+0.039899371 container died 7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, release=553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, distribution-scope=public) Nov 26 
03:47:01 localhost systemd[1]: var-lib-containers-storage-overlay-2cefb109f6fb223f719f6e3f8c0202f4c9985ee6bdb9a9951389cfb21bcc0461-merged.mount: Deactivated successfully. Nov 26 03:47:01 localhost podman[96396]: 2025-11-26 08:47:01.146992124 +0000 UTC m=+0.081591857 container remove 7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , ceph=True, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, release=553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 03:47:01 localhost systemd[1]: libpod-conmon-7ca8108ea7466a3a03d4a836c548e187069be14a6e5b20e000aaf395b2b05770.scope: Deactivated successfully. Nov 26 03:47:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:47:07 localhost podman[96425]: 2025-11-26 08:47:07.836130194 +0000 UTC m=+0.093751491 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, config_id=tripleo_step1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:47:08 localhost podman[96425]: 2025-11-26 08:47:08.058803401 +0000 UTC m=+0.316424608 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=, vcs-type=git) Nov 26 03:47:08 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:47:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:47:12 localhost podman[96454]: 2025-11-26 08:47:12.826017007 +0000 UTC m=+0.086246840 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044) Nov 26 03:47:12 localhost podman[96454]: 2025-11-26 08:47:12.885894314 +0000 UTC m=+0.146124137 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step5, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible) Nov 26 03:47:12 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:47:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:47:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:47:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:47:18 localhost podman[96481]: 2025-11-26 08:47:18.810793858 +0000 UTC m=+0.067505503 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1) Nov 26 03:47:18 localhost podman[96481]: 2025-11-26 08:47:18.822412546 +0000 UTC m=+0.079124181 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:47:18 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:47:18 localhost podman[96482]: 2025-11-26 08:47:18.880038832 +0000 UTC m=+0.136355045 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 26 03:47:18 localhost podman[96480]: 2025-11-26 08:47:18.893975812 +0000 UTC m=+0.153626987 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:47:18 localhost podman[96482]: 2025-11-26 08:47:18.933566522 +0000 UTC m=+0.189882765 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64) Nov 26 03:47:18 localhost 
systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:47:18 localhost podman[96480]: 2025-11-26 08:47:18.948412831 +0000 UTC m=+0.208064016 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 26 03:47:18 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:47:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:47:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:47:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:47:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:47:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:47:20 localhost systemd[1]: tmp-crun.aqSzjY.mount: Deactivated successfully. 
Nov 26 03:47:20 localhost podman[96551]: 2025-11-26 08:47:20.821525068 +0000 UTC m=+0.078819381 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4) Nov 26 03:47:20 localhost podman[96551]: 2025-11-26 08:47:20.834258511 +0000 UTC m=+0.091552824 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 
ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:47:20 localhost podman[96551]: unhealthy Nov 26 03:47:20 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:47:20 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:47:20 localhost podman[96550]: 2025-11-26 08:47:20.880210428 +0000 UTC m=+0.140325658 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Nov 26 03:47:20 localhost podman[96550]: 2025-11-26 08:47:20.916300511 +0000 UTC m=+0.176415691 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, 
managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
tcib_managed=true, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:47:20 localhost podman[96549]: 2025-11-26 08:47:20.930124687 +0000 UTC m=+0.194734625 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, release=1761123044, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4) Nov 26 03:47:20 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:47:20 localhost podman[96549]: 2025-11-26 08:47:20.943201191 +0000 UTC m=+0.207811129 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, container_name=iscsid, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container) Nov 26 03:47:20 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:47:20 localhost podman[96558]: 2025-11-26 08:47:20.988390164 +0000 UTC m=+0.239834077 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:47:21 localhost podman[96557]: 2025-11-26 08:47:21.02164799 +0000 UTC m=+0.276186458 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 03:47:21 localhost podman[96557]: 2025-11-26 08:47:21.036327792 +0000 UTC m=+0.290866320 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:47:21 localhost podman[96557]: unhealthy Nov 26 03:47:21 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:47:21 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:47:21 localhost podman[96558]: 2025-11-26 08:47:21.351422629 +0000 UTC m=+0.602866582 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:47:21 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:47:21 localhost sshd[96645]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:47:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:47:38 localhost podman[96647]: 2025-11-26 08:47:38.830592389 +0000 UTC m=+0.083808595 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:47:39 localhost podman[96647]: 2025-11-26 08:47:39.022525638 +0000 UTC m=+0.275741814 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:47:39 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:47:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:47:43 localhost systemd[1]: tmp-crun.ZkHFHc.mount: Deactivated successfully. Nov 26 03:47:43 localhost podman[96676]: 2025-11-26 08:47:43.840033885 +0000 UTC m=+0.099538320 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, config_id=tripleo_step5, io.openshift.expose-services=, version=17.1.12) Nov 26 03:47:43 localhost podman[96676]: 
2025-11-26 08:47:43.902626295 +0000 UTC m=+0.162130730 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 26 03:47:43 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:47:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:47:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:47:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:47:49 localhost podman[96702]: 2025-11-26 08:47:49.829208321 +0000 UTC m=+0.091968337 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, architecture=x86_64, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Nov 26 03:47:49 localhost podman[96702]: 2025-11-26 08:47:49.886184488 +0000 UTC m=+0.148944514 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:47:49 localhost systemd[1]: tmp-crun.mZpgO0.mount: Deactivated successfully. Nov 26 03:47:49 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:47:49 localhost podman[96703]: 2025-11-26 08:47:49.943906498 +0000 UTC m=+0.202385061 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, architecture=x86_64, 
io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, tcib_managed=true) Nov 26 03:47:49 localhost podman[96704]: 2025-11-26 08:47:49.904110231 +0000 UTC m=+0.162401118 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 26 03:47:49 localhost podman[96703]: 2025-11-26 08:47:49.956415734 +0000 UTC m=+0.214894297 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc.) Nov 26 03:47:49 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:47:50 localhost podman[96704]: 2025-11-26 08:47:50.039460314 +0000 UTC m=+0.297751231 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1) Nov 26 03:47:50 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:47:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:47:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:47:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:47:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:47:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:47:51 localhost systemd[1]: tmp-crun.XxGviD.mount: Deactivated successfully. 
Nov 26 03:47:51 localhost podman[96772]: 2025-11-26 08:47:51.844587186 +0000 UTC m=+0.105831674 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, release=1761123044, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:47:51 localhost podman[96773]: 2025-11-26 08:47:51.894746782 +0000 UTC m=+0.154162444 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Nov 26 03:47:51 localhost podman[96772]: 2025-11-26 08:47:51.906604068 +0000 UTC m=+0.167848586 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid) 
Nov 26 03:47:51 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:47:51 localhost podman[96774]: 2025-11-26 08:47:51.988918446 +0000 UTC m=+0.244324474 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, version=17.1.12, container_name=ovn_controller, 
managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1) Nov 26 03:47:52 localhost podman[96773]: 2025-11-26 08:47:52.009727789 +0000 UTC m=+0.269143461 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, container_name=collectd, name=rhosp17/openstack-collectd, release=1761123044, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 26 03:47:52 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:47:52 localhost podman[96774]: 2025-11-26 08:47:52.032314655 +0000 UTC m=+0.287720673 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:47:52 localhost podman[96774]: unhealthy Nov 26 03:47:52 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:47:52 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:47:52 localhost podman[96775]: 2025-11-26 08:47:52.095515253 +0000 UTC m=+0.347842366 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:47:52 localhost podman[96775]: 2025-11-26 08:47:52.139275973 +0000 UTC m=+0.391603036 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true) Nov 26 03:47:52 localhost podman[96775]: unhealthy Nov 26 03:47:52 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:47:52 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:47:52 localhost podman[96781]: 2025-11-26 08:47:52.154296516 +0000 UTC m=+0.403740491 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, container_name=nova_migration_target, release=1761123044, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:47:52 localhost podman[96781]: 2025-11-26 08:47:52.556337673 +0000 UTC m=+0.805781608 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, container_name=nova_migration_target, version=17.1.12, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 03:47:52 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:47:52 localhost systemd[1]: tmp-crun.1yPHXV.mount: Deactivated successfully. Nov 26 03:48:01 localhost sshd[96872]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:48:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:48:09 localhost systemd[1]: tmp-crun.t8WEpp.mount: Deactivated successfully. Nov 26 03:48:09 localhost podman[97001]: 2025-11-26 08:48:09.838344128 +0000 UTC m=+0.102987787 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, architecture=x86_64) Nov 26 03:48:10 localhost podman[97001]: 2025-11-26 08:48:10.033271098 +0000 UTC m=+0.297914687 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:48:10 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:48:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:48:14 localhost podman[97030]: 2025-11-26 08:48:14.813171328 +0000 UTC m=+0.075379566 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_id=tripleo_step5, container_name=nova_compute, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:48:14 localhost podman[97030]: 2025-11-26 08:48:14.84343415 +0000 UTC m=+0.105642468 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:48:14 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:48:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:48:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:48:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:48:20 localhost systemd[1]: tmp-crun.Vs3RAi.mount: Deactivated successfully. 
Nov 26 03:48:20 localhost podman[97058]: 2025-11-26 08:48:20.839608382 +0000 UTC m=+0.094614869 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 26 03:48:20 localhost podman[97057]: 2025-11-26 08:48:20.870741092 +0000 UTC m=+0.124196451 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, container_name=logrotate_crond, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:48:20 localhost podman[97057]: 2025-11-26 08:48:20.883651929 +0000 UTC m=+0.137107358 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=logrotate_crond, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:48:20 localhost podman[97056]: 2025-11-26 08:48:20.893170983 +0000 UTC m=+0.149304654 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
container_name=ceilometer_agent_compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:48:20 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:48:20 localhost podman[97058]: 2025-11-26 08:48:20.90538779 +0000 UTC m=+0.160394247 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc.) Nov 26 03:48:20 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:48:20 localhost podman[97056]: 2025-11-26 08:48:20.949426158 +0000 UTC m=+0.205559799 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, 
name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, version=17.1.12) Nov 26 03:48:20 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:48:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:48:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:48:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:48:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:48:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:48:22 localhost systemd[1]: tmp-crun.lIa3wk.mount: Deactivated successfully. 
Nov 26 03:48:22 localhost podman[97128]: 2025-11-26 08:48:22.824352261 +0000 UTC m=+0.084774505 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git) Nov 26 03:48:22 localhost podman[97141]: 2025-11-26 08:48:22.846103021 +0000 UTC m=+0.088338235 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Nov 26 03:48:22 localhost podman[97127]: 2025-11-26 08:48:22.903104409 +0000 UTC m=+0.165971459 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible) Nov 26 03:48:22 localhost podman[97128]: 2025-11-26 08:48:22.905308187 +0000 UTC m=+0.165730361 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=collectd, architecture=x86_64) Nov 26 03:48:22 localhost podman[97127]: 2025-11-26 08:48:22.912226921 +0000 UTC m=+0.175093991 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, summary=Red Hat 
OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, release=1761123044, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:48:22 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:48:22 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:48:22 localhost podman[97140]: 2025-11-26 08:48:22.880844482 +0000 UTC m=+0.132846967 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 26 03:48:22 localhost podman[97140]: 2025-11-26 08:48:22.962196381 +0000 UTC m=+0.214198886 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, 
Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:48:22 localhost podman[97140]: unhealthy Nov 26 03:48:22 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:48:22 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:48:22 localhost podman[97129]: 2025-11-26 08:48:22.974825231 +0000 UTC m=+0.225645900 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, release=1761123044, tcib_managed=true, container_name=ovn_controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Nov 26 03:48:22 localhost podman[97129]: 2025-11-26 08:48:22.992276629 +0000 UTC m=+0.243097298 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:48:22 localhost podman[97129]: unhealthy Nov 26 03:48:22 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:48:22 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:48:23 localhost podman[97141]: 2025-11-26 08:48:23.186912381 +0000 UTC m=+0.429147635 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, distribution-scope=public, container_name=nova_migration_target, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true) Nov 26 03:48:23 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:48:23 localhost systemd[1]: tmp-crun.cqSZYk.mount: Deactivated successfully. Nov 26 03:48:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:48:40 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:48:40 localhost recover_tripleo_nova_virtqemud[97224]: 61604 Nov 26 03:48:40 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:48:40 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 03:48:40 localhost podman[97222]: 2025-11-26 08:48:40.829432699 +0000 UTC m=+0.089434088 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:48:41 localhost podman[97222]: 2025-11-26 08:48:41.03083647 +0000 UTC m=+0.290837819 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:48:41 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:48:41 localhost sshd[97253]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:48:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:48:45 localhost podman[97255]: 2025-11-26 08:48:45.816130926 +0000 UTC m=+0.081432503 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, container_name=nova_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Nov 26 03:48:45 localhost podman[97255]: 2025-11-26 08:48:45.868247843 +0000 UTC m=+0.133549440 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true) Nov 26 03:48:45 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:48:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:48:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:48:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:48:51 localhost podman[97281]: 2025-11-26 08:48:51.824466942 +0000 UTC m=+0.087651043 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:48:51 localhost podman[97281]: 2025-11-26 08:48:51.836202914 +0000 UTC m=+0.099387025 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 26 03:48:51 localhost systemd[1]: tmp-crun.uizevO.mount: Deactivated successfully. 
Nov 26 03:48:51 localhost podman[97282]: 2025-11-26 08:48:51.879897312 +0000 UTC m=+0.138144701 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack 
TripleO Team, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:12:45Z) Nov 26 03:48:51 localhost podman[97280]: 2025-11-26 08:48:51.934174335 +0000 UTC m=+0.197264964 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 26 03:48:51 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:48:51 localhost podman[97282]: 2025-11-26 08:48:51.987866141 +0000 UTC m=+0.246113570 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:12:45Z, version=17.1.12) Nov 26 03:48:52 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:48:52 localhost podman[97280]: 2025-11-26 08:48:52.043005881 +0000 UTC m=+0.306096510 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:48:52 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:48:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:48:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:48:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:48:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:48:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:48:53 localhost podman[97359]: 2025-11-26 08:48:53.830124267 +0000 UTC m=+0.080412160 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Nov 26 03:48:53 localhost podman[97350]: 2025-11-26 08:48:53.883190043 +0000 UTC m=+0.142204326 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 26 03:48:53 localhost podman[97350]: 2025-11-26 08:48:53.895301997 +0000 UTC m=+0.154316320 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, 
config_id=tripleo_step3, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:48:53 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:48:53 localhost systemd[1]: tmp-crun.wrxqr5.mount: Deactivated successfully. Nov 26 03:48:53 localhost podman[97354]: 2025-11-26 08:48:53.953955675 +0000 UTC m=+0.203415052 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, container_name=ovn_metadata_agent, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z) Nov 26 03:48:54 localhost podman[97352]: 2025-11-26 08:48:54.037329936 +0000 UTC m=+0.288271710 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 26 03:48:54 localhost podman[97352]: 2025-11-26 08:48:54.054388502 +0000 UTC m=+0.305330266 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, 
name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, version=17.1.12, container_name=ovn_controller, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 26 03:48:54 localhost podman[97352]: unhealthy Nov 26 03:48:54 localhost 
systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:48:54 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:48:54 localhost podman[97354]: 2025-11-26 08:48:54.075687119 +0000 UTC m=+0.325146476 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:48:54 localhost podman[97354]: unhealthy Nov 26 03:48:54 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:48:54 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:48:54 localhost podman[97351]: 2025-11-26 08:48:54.062308077 +0000 UTC m=+0.317969126 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, io.openshift.expose-services=) Nov 26 03:48:54 localhost podman[97351]: 2025-11-26 08:48:54.144379887 +0000 UTC m=+0.400040936 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:48:54 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:48:54 localhost podman[97359]: 2025-11-26 08:48:54.216762139 +0000 UTC m=+0.467050032 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git) Nov 26 03:48:54 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:49:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:49:11 localhost systemd[1]: tmp-crun.zpKIca.mount: Deactivated successfully. Nov 26 03:49:11 localhost podman[97530]: 2025-11-26 08:49:11.87298733 +0000 UTC m=+0.135677675 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Nov 26 03:49:12 localhost podman[97530]: 2025-11-26 08:49:12.052260598 +0000 UTC m=+0.314950963 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Nov 26 03:49:12 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:49:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:49:16 localhost podman[97560]: 2025-11-26 08:49:16.875647299 +0000 UTC m=+0.136686076 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true) Nov 26 03:49:16 localhost podman[97560]: 2025-11-26 08:49:16.908517982 +0000 UTC m=+0.169556789 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=nova_compute, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:49:16 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:49:19 localhost sshd[97587]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:49:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:49:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:49:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:49:22 localhost podman[97591]: 2025-11-26 08:49:22.801079289 +0000 UTC m=+0.062625973 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:49:22 localhost podman[97589]: 2025-11-26 08:49:22.871341765 +0000 UTC m=+0.135053535 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com) Nov 26 03:49:22 localhost podman[97590]: 2025-11-26 08:49:22.877095673 +0000 UTC m=+0.138642107 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4) Nov 26 03:49:22 localhost podman[97590]: 2025-11-26 
08:49:22.886167272 +0000 UTC m=+0.147713686 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=logrotate_crond, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:49:22 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:49:22 localhost podman[97589]: 2025-11-26 08:49:22.895235362 +0000 UTC m=+0.158947132 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 26 03:49:22 localhost podman[97591]: 2025-11-26 08:49:22.904654373 +0000 UTC m=+0.166201037 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 26 03:49:22 localhost systemd[1]: 
3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:49:22 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:49:24 localhost podman[97664]: 2025-11-26 08:49:24.800677886 +0000 UTC m=+0.067963557 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4) Nov 26 03:49:24 localhost systemd[1]: tmp-crun.APetCQ.mount: Deactivated successfully. 
Nov 26 03:49:24 localhost podman[97666]: 2025-11-26 08:49:24.856861679 +0000 UTC m=+0.118713863 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:49:24 localhost podman[97665]: 2025-11-26 08:49:24.922177592 +0000 UTC m=+0.184699576 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, tcib_managed=true, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:49:24 localhost podman[97663]: 2025-11-26 08:49:24.825113699 +0000 UTC m=+0.089906833 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:49:24 localhost podman[97665]: 2025-11-26 08:49:24.929638113 +0000 UTC m=+0.192160017 container exec_died 
4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:49:24 localhost podman[97665]: unhealthy Nov 26 03:49:24 localhost podman[97664]: 2025-11-26 08:49:24.935632257 +0000 UTC m=+0.202917978 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12) Nov 26 03:49:24 localhost podman[97666]: 2025-11-26 08:49:24.943642364 +0000 UTC m=+0.205494518 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:49:24 localhost systemd[1]: 
4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:49:24 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:49:24 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:49:24 localhost podman[97666]: unhealthy Nov 26 03:49:24 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:49:24 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:49:24 localhost podman[97663]: 2025-11-26 08:49:24.95516105 +0000 UTC m=+0.219954234 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12) Nov 26 03:49:24 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:49:24 localhost podman[97672]: 2025-11-26 08:49:24.897954545 +0000 UTC m=+0.156808576 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com) Nov 26 03:49:25 localhost podman[97672]: 2025-11-26 08:49:25.286091664 +0000 UTC m=+0.544945775 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com) Nov 26 03:49:25 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:49:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:49:42 localhost systemd[1]: tmp-crun.pOCdhs.mount: Deactivated successfully. 
Nov 26 03:49:42 localhost podman[97765]: 2025-11-26 08:49:42.830727164 +0000 UTC m=+0.099985904 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, container_name=metrics_qdr, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z) Nov 26 03:49:43 localhost podman[97765]: 2025-11-26 08:49:43.024485958 +0000 UTC m=+0.293744668 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, container_name=metrics_qdr, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1) Nov 26 03:49:43 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:49:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:49:47 localhost podman[97794]: 2025-11-26 08:49:47.817285865 +0000 UTC m=+0.079280596 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container) Nov 26 03:49:47 localhost podman[97794]: 2025-11-26 08:49:47.849262391 +0000 UTC m=+0.111257132 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': 
{'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:49:47 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:49:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:49:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:49:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:49:53 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:49:53 localhost recover_tripleo_nova_virtqemud[97840]: 61604 Nov 26 03:49:53 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:49:53 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 03:49:53 localhost podman[97826]: 2025-11-26 08:49:53.817963668 +0000 UTC m=+0.074206870 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4) Nov 26 03:49:53 localhost podman[97826]: 2025-11-26 08:49:53.86960237 +0000 UTC m=+0.125845582 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com) Nov 26 03:49:53 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:49:53 localhost podman[97819]: 2025-11-26 08:49:53.875226483 +0000 UTC m=+0.140166873 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:49:53 localhost podman[97820]: 2025-11-26 08:49:53.930527679 +0000 UTC m=+0.187015668 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container) Nov 26 03:49:53 localhost podman[97819]: 2025-11-26 08:49:53.959397979 +0000 UTC m=+0.224338389 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, vcs-type=git, 
release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64) Nov 26 03:49:53 localhost podman[97820]: 2025-11-26 08:49:53.969386187 +0000 UTC m=+0.225874136 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 
17.1 cron, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:49:53 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:49:54 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:49:55 localhost systemd[1]: tmp-crun.18qo6I.mount: Deactivated successfully. 
Nov 26 03:49:55 localhost podman[97895]: 2025-11-26 08:49:55.830068921 +0000 UTC m=+0.095631629 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, container_name=iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, version=17.1.12) Nov 26 03:49:55 localhost podman[97904]: 2025-11-26 08:49:55.837219982 +0000 UTC m=+0.087411617 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Nov 26 03:49:55 localhost podman[97896]: 2025-11-26 08:49:55.895364775 +0000 UTC m=+0.151543874 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, 
health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, container_name=collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:49:55 localhost podman[97896]: 2025-11-26 08:49:55.903087543 +0000 UTC m=+0.159266572 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:49:55 localhost podman[97900]: 2025-11-26 08:49:55.860678975 +0000 UTC m=+0.113581893 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, release=1761123044, io.buildah.version=1.41.4) Nov 26 03:49:55 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:49:55 localhost podman[97904]: 2025-11-26 08:49:55.916794236 +0000 UTC m=+0.166985921 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, version=17.1.12, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:49:55 localhost podman[97904]: unhealthy Nov 26 03:49:55 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:49:55 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:49:55 localhost podman[97900]: 2025-11-26 08:49:55.945358966 +0000 UTC m=+0.198261874 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller) Nov 26 03:49:55 localhost podman[97900]: unhealthy Nov 26 03:49:55 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:49:55 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:49:55 localhost podman[97895]: 2025-11-26 08:49:55.965006402 +0000 UTC m=+0.230569070 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Nov 26 03:49:55 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:49:56 localhost podman[97905]: 2025-11-26 08:49:56.012429334 +0000 UTC m=+0.259096359 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, 
version=17.1.12, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Nov 26 03:49:56 localhost podman[97905]: 2025-11-26 08:49:56.380477124 +0000 UTC m=+0.627144169 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Nov 26 03:49:56 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. 
Nov 26 03:49:59 localhost sshd[97992]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:50:06 localhost podman[98097]: 2025-11-26 08:50:06.834072028 +0000 UTC m=+0.103252194 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12) Nov 26 03:50:06 localhost podman[98097]: 2025-11-26 08:50:06.935519396 +0000 UTC m=+0.204699602 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 03:50:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:50:13 localhost podman[98240]: 2025-11-26 08:50:13.821657562 +0000 UTC m=+0.079300226 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Nov 26 03:50:14 localhost podman[98240]: 2025-11-26 08:50:14.025665093 +0000 UTC m=+0.283307737 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red 
Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 03:50:14 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:50:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:50:18 localhost systemd[1]: tmp-crun.EBFmms.mount: Deactivated successfully. Nov 26 03:50:18 localhost podman[98271]: 2025-11-26 08:50:18.817171209 +0000 UTC m=+0.080251806 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 26 03:50:18 localhost podman[98271]: 2025-11-26 08:50:18.850381322 +0000 UTC m=+0.113461909 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, config_id=tripleo_step5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:50:18 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:50:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:50:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:50:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:50:24 localhost podman[98297]: 2025-11-26 08:50:24.830805007 +0000 UTC m=+0.091602884 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:50:24 localhost podman[98297]: 2025-11-26 08:50:24.886303819 +0000 UTC m=+0.147101716 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute) Nov 26 03:50:24 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:50:24 localhost podman[98298]: 2025-11-26 08:50:24.869637525 +0000 UTC m=+0.123590272 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack 
Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:50:24 localhost podman[98303]: 2025-11-26 08:50:24.893756239 +0000 UTC m=+0.144680642 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com) Nov 26 03:50:24 localhost podman[98298]: 2025-11-26 08:50:24.952340586 +0000 UTC m=+0.206293343 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z) Nov 26 03:50:24 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:50:24 localhost podman[98303]: 2025-11-26 08:50:24.976307804 +0000 UTC m=+0.227232177 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:50:24 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:50:26 localhost systemd[1]: tmp-crun.XrDVZz.mount: Deactivated successfully. 
Nov 26 03:50:26 localhost podman[98379]: 2025-11-26 08:50:26.842022182 +0000 UTC m=+0.086300852 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Nov 26 03:50:26 localhost podman[98370]: 2025-11-26 08:50:26.892122647 +0000 UTC m=+0.147768927 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, 
Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team) Nov 26 03:50:26 localhost podman[98370]: 2025-11-26 08:50:26.902175018 +0000 UTC m=+0.157821308 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git) Nov 26 03:50:26 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:50:26 localhost podman[98372]: 2025-11-26 08:50:26.946537456 +0000 UTC m=+0.193253370 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 26 03:50:26 localhost podman[98372]: 2025-11-26 08:50:26.986884539 +0000 UTC m=+0.233600433 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:50:26 localhost podman[98372]: unhealthy Nov 26 03:50:26 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:50:26 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:50:27 localhost podman[98371]: 2025-11-26 08:50:26.988624773 +0000 UTC m=+0.238158504 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 
'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-collectd, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Nov 26 03:50:27 localhost podman[98371]: 2025-11-26 08:50:27.073324325 +0000 UTC m=+0.322857976 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:50:27 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:50:27 localhost podman[98378]: 2025-11-26 08:50:27.041994409 +0000 UTC m=+0.284406461 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:50:27 localhost podman[98378]: 2025-11-26 08:50:27.122262074 +0000 UTC m=+0.364674036 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, url=https://www.redhat.com) Nov 26 03:50:27 localhost podman[98378]: unhealthy Nov 26 03:50:27 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:50:27 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:50:27 localhost podman[98379]: 2025-11-26 08:50:27.185342279 +0000 UTC m=+0.429620959 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4) Nov 26 03:50:27 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:50:38 localhost sshd[98470]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:50:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:50:44 localhost podman[98472]: 2025-11-26 08:50:44.816768727 +0000 UTC m=+0.073126497 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, architecture=x86_64, url=https://www.redhat.com, version=17.1.12) Nov 26 03:50:45 localhost podman[98472]: 2025-11-26 08:50:45.004001849 +0000 UTC m=+0.260359629 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 26 03:50:45 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:50:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:50:49 localhost podman[98500]: 2025-11-26 08:50:49.820020143 +0000 UTC m=+0.079109971 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, config_id=tripleo_step5, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:50:49 localhost podman[98500]: 2025-11-26 08:50:49.881650263 +0000 UTC m=+0.140740161 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, tcib_managed=true, url=https://www.redhat.com) Nov 26 03:50:49 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:50:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:50:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:50:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:50:55 localhost podman[98527]: 2025-11-26 08:50:55.833389878 +0000 UTC m=+0.091756271 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-cron) Nov 26 03:50:55 localhost podman[98527]: 2025-11-26 08:50:55.84256189 +0000 UTC m=+0.100928333 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, name=rhosp17/openstack-cron, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, architecture=x86_64, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Nov 26 03:50:55 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:50:55 localhost podman[98528]: 2025-11-26 08:50:55.892876772 +0000 UTC m=+0.142727902 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 26 03:50:55 localhost podman[98526]: 2025-11-26 08:50:55.811278316 +0000 UTC m=+0.073375423 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, distribution-scope=public, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:50:55 localhost podman[98528]: 2025-11-26 08:50:55.944851895 +0000 UTC m=+0.194703045 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:50:55 
localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:50:55 localhost podman[98526]: 2025-11-26 08:50:55.995589769 +0000 UTC m=+0.257686856 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:50:56 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:50:56 localhost systemd[1]: tmp-crun.iZzYMN.mount: Deactivated successfully. Nov 26 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:50:57 localhost systemd[1]: tmp-crun.j6FZ6F.mount: Deactivated successfully. 
Nov 26 03:50:57 localhost podman[98614]: 2025-11-26 08:50:57.862066482 +0000 UTC m=+0.080108001 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
distribution-scope=public, tcib_managed=true, version=17.1.12, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:50:57 localhost podman[98604]: 2025-11-26 08:50:57.958387902 +0000 UTC m=+0.177092542 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 
17.1_20251118.1, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true) Nov 26 03:50:57 localhost podman[98604]: 2025-11-26 08:50:57.971354891 +0000 UTC m=+0.190059491 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vendor=Red Hat, Inc.) Nov 26 03:50:57 localhost podman[98604]: unhealthy Nov 26 03:50:57 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:50:57 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:50:58 localhost podman[98601]: 2025-11-26 08:50:58.023997865 +0000 UTC m=+0.276913980 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
container_name=collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044) Nov 26 03:50:58 localhost podman[98600]: 2025-11-26 08:50:57.936241769 +0000 UTC m=+0.191459525 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=, release=1761123044, architecture=x86_64, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, 
com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z) Nov 26 03:50:58 localhost podman[98601]: 2025-11-26 08:50:58.058207549 +0000 UTC m=+0.311123634 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=collectd, 
build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.expose-services=, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git) Nov 26 03:50:58 localhost podman[98600]: 2025-11-26 08:50:58.065874076 +0000 UTC m=+0.321091842 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container) Nov 26 03:50:58 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:50:58 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:50:58 localhost podman[98602]: 2025-11-26 08:50:57.833726328 +0000 UTC m=+0.083094044 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4) Nov 26 03:50:58 localhost podman[98602]: 2025-11-26 08:50:58.125424852 +0000 UTC m=+0.374792608 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, release=1761123044, architecture=x86_64, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:50:58 localhost podman[98602]: unhealthy Nov 26 03:50:58 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:50:58 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:50:58 localhost podman[98614]: 2025-11-26 08:50:58.225587571 +0000 UTC m=+0.443629100 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:50:58 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:50:58 localhost systemd[1]: tmp-crun.btYub8.mount: Deactivated successfully. Nov 26 03:51:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:51:15 localhost podman[98776]: 2025-11-26 08:51:15.817912041 +0000 UTC m=+0.079558805 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, release=1761123044, maintainer=OpenStack 
TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:51:15 localhost podman[98776]: 2025-11-26 08:51:15.991517804 +0000 UTC m=+0.253164568 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Nov 26 03:51:16 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:51:18 localhost sshd[98805]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:51:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:51:20 localhost podman[98807]: 2025-11-26 08:51:20.822486936 +0000 UTC m=+0.084159706 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, container_name=nova_compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:51:20 localhost podman[98807]: 2025-11-26 08:51:20.852515382 +0000 UTC m=+0.114188192 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step5) Nov 26 03:51:20 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:51:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:51:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:51:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:51:26 localhost systemd[1]: tmp-crun.yk0lQG.mount: Deactivated successfully. 
Nov 26 03:51:26 localhost podman[98834]: 2025-11-26 08:51:26.837087184 +0000 UTC m=+0.102025307 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 26 03:51:26 localhost podman[98834]: 2025-11-26 08:51:26.859914227 +0000 UTC m=+0.124852350 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ceilometer_agent_compute, release=1761123044, vcs-type=git, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 26 03:51:26 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:51:26 localhost podman[98835]: 2025-11-26 08:51:26.934515738 +0000 UTC m=+0.196361426 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public) Nov 26 03:51:26 localhost podman[98836]: 2025-11-26 08:51:26.972791199 +0000 UTC m=+0.230023084 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 26 03:51:26 localhost podman[98835]: 2025-11-26 08:51:26.977109241 +0000 UTC m=+0.238954879 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, container_name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 
17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:32Z, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-cron) Nov 26 03:51:26 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:51:27 localhost podman[98836]: 2025-11-26 08:51:27.057403688 +0000 UTC m=+0.314635513 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team) Nov 26 03:51:27 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:51:27 localhost systemd[1]: tmp-crun.tugdD0.mount: Deactivated successfully. Nov 26 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:51:28 localhost podman[98908]: 2025-11-26 08:51:28.863912841 +0000 UTC m=+0.117737561 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, config_id=tripleo_step4) Nov 26 03:51:28 localhost podman[98907]: 2025-11-26 08:51:28.830155061 +0000 UTC m=+0.091114682 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, url=https://www.redhat.com, release=1761123044, vcs-type=git, name=rhosp17/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) 
Nov 26 03:51:28 localhost podman[98906]: 2025-11-26 08:51:28.887741036 +0000 UTC m=+0.151546854 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:51:28 localhost podman[98906]: 2025-11-26 08:51:28.900334484 +0000 UTC m=+0.164140252 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:51:28 localhost podman[98908]: 2025-11-26 08:51:28.908657801 +0000 UTC m=+0.162482561 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, url=https://www.redhat.com, 
maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:51:28 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:51:28 localhost podman[98908]: unhealthy Nov 26 03:51:28 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:51:28 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:51:28 localhost podman[98907]: 2025-11-26 08:51:28.963826433 +0000 UTC m=+0.224786104 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container) Nov 26 03:51:28 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:51:29 localhost podman[98920]: 2025-11-26 08:51:29.052175336 +0000 UTC m=+0.299118974 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, distribution-scope=public, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:51:29 localhost podman[98912]: 2025-11-26 08:51:29.105629434 +0000 UTC m=+0.358953869 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 26 03:51:29 localhost podman[98912]: 2025-11-26 08:51:29.124400974 +0000 UTC m=+0.377725439 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1) Nov 26 03:51:29 localhost podman[98912]: unhealthy Nov 26 03:51:29 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:51:29 localhost systemd[1]: 
670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:51:29 localhost podman[98920]: 2025-11-26 08:51:29.409566127 +0000 UTC m=+0.656509795 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public) Nov 26 03:51:29 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:51:29 localhost systemd[1]: tmp-crun.bmL82l.mount: Deactivated successfully. Nov 26 03:51:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:51:37 localhost recover_tripleo_nova_virtqemud[99009]: 61604 Nov 26 03:51:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:51:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:51:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:51:46 localhost podman[99011]: 2025-11-26 08:51:46.828222699 +0000 UTC m=+0.087190689 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:51:47 localhost podman[99011]: 2025-11-26 08:51:47.014257296 +0000 UTC m=+0.273225286 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:51:47 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:51:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:51:51 localhost podman[99040]: 2025-11-26 08:51:51.821198648 +0000 UTC m=+0.088738957 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=nova_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team) Nov 26 03:51:51 localhost podman[99040]: 2025-11-26 08:51:51.84882724 +0000 UTC m=+0.116367529 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:51:51 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:51:55 localhost sshd[99066]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:51:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:51:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:51:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:51:57 localhost podman[99070]: 2025-11-26 08:51:57.83453829 +0000 UTC m=+0.088837689 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git) Nov 26 03:51:57 localhost podman[99069]: 2025-11-26 08:51:57.873593005 +0000 UTC m=+0.132129755 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=) Nov 26 03:51:57 localhost podman[99069]: 2025-11-26 08:51:57.880476798 +0000 UTC m=+0.139013578 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, build-date=2025-11-18T22:49:32Z, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=logrotate_crond, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:51:57 localhost podman[99070]: 2025-11-26 08:51:57.891485337 +0000 UTC m=+0.145784696 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, 
batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc.) Nov 26 03:51:57 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:51:57 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:51:57 localhost podman[99068]: 2025-11-26 08:51:57.985075853 +0000 UTC m=+0.245512872 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:51:58 localhost podman[99068]: 2025-11-26 08:51:58.020511616 +0000 UTC m=+0.280948665 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:51:58 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:51:59 localhost systemd[1]: tmp-crun.S61XPO.mount: Deactivated successfully. Nov 26 03:51:59 localhost podman[99143]: 2025-11-26 08:51:59.824696237 +0000 UTC m=+0.081503124 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc.) 
Nov 26 03:51:59 localhost podman[99143]: 2025-11-26 08:51:59.83746223 +0000 UTC m=+0.094269147 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller) Nov 26 03:51:59 localhost podman[99143]: unhealthy Nov 26 03:51:59 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:51:59 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:51:59 localhost podman[99141]: 2025-11-26 08:51:59.889884666 +0000 UTC m=+0.149481549 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, container_name=iscsid, release=1761123044, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:51:59 localhost podman[99144]: 2025-11-26 08:51:59.902761274 +0000 UTC m=+0.155162096 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4) Nov 26 03:51:59 localhost podman[99141]: 2025-11-26 08:51:59.927537127 +0000 UTC m=+0.187134070 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Nov 26 03:51:59 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:51:59 localhost podman[99142]: 2025-11-26 08:51:59.976190068 +0000 UTC m=+0.235058799 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:51:59 localhost podman[99144]: 2025-11-26 08:51:59.997788004 +0000 UTC m=+0.250188826 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:52:00 localhost podman[99144]: unhealthy Nov 26 03:52:00 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:52:00 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:52:00 localhost podman[99145]: 2025-11-26 08:52:00.012978462 +0000 UTC m=+0.262418143 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, version=17.1.12, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 26 03:52:00 localhost podman[99142]: 2025-11-26 08:52:00.065375028 +0000 UTC m=+0.324243739 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:52:00 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:52:00 localhost podman[99145]: 2025-11-26 08:52:00.38133326 +0000 UTC m=+0.630772991 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:52:00 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:52:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:52:17 localhost systemd[1]: tmp-crun.IswEtR.mount: Deactivated successfully. Nov 26 03:52:17 localhost podman[99313]: 2025-11-26 08:52:17.836072039 +0000 UTC m=+0.093483344 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:52:18 localhost podman[99313]: 2025-11-26 08:52:18.038827131 +0000 UTC m=+0.296238496 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:52:18 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:52:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:52:22 localhost systemd[1]: tmp-crun.cAiQOS.mount: Deactivated successfully. Nov 26 03:52:22 localhost podman[99342]: 2025-11-26 08:52:22.800915209 +0000 UTC m=+0.063721585 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:52:22 localhost podman[99342]: 2025-11-26 08:52:22.848396563 +0000 UTC m=+0.111202959 container exec_died 
f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, 
architecture=x86_64, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1761123044, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1) Nov 26 03:52:22 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:52:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:52:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:52:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:52:28 localhost systemd[1]: tmp-crun.FmbeRM.mount: Deactivated successfully. Nov 26 03:52:28 localhost systemd[1]: tmp-crun.A4PDZq.mount: Deactivated successfully. 
Nov 26 03:52:28 localhost podman[99369]: 2025-11-26 08:52:28.909604782 +0000 UTC m=+0.166561527 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, tcib_managed=true, batch=17.1_20251118.1) Nov 26 03:52:28 localhost podman[99370]: 2025-11-26 08:52:28.863061726 +0000 UTC m=+0.115865303 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:52:28 localhost podman[99369]: 2025-11-26 08:52:28.92250874 +0000 UTC m=+0.179465525 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, name=rhosp17/openstack-cron, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-cron-container) Nov 26 03:52:28 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:52:28 localhost podman[99370]: 2025-11-26 08:52:28.998806533 +0000 UTC m=+0.251610020 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi) Nov 26 03:52:29 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:52:29 localhost podman[99368]: 2025-11-26 08:52:28.863157799 +0000 UTC m=+0.126240443 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1) Nov 26 03:52:29 localhost podman[99368]: 2025-11-26 08:52:29.048444673 +0000 UTC m=+0.311527347 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12) Nov 26 03:52:29 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:52:29 localhost systemd[1]: tmp-crun.CsYbyn.mount: Deactivated successfully. Nov 26 03:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:52:30 localhost systemd[1]: tmp-crun.GQaCN9.mount: Deactivated successfully. 
Nov 26 03:52:30 localhost podman[99442]: 2025-11-26 08:52:30.862713126 +0000 UTC m=+0.116517253 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, container_name=ovn_controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team) Nov 26 03:52:30 localhost podman[99441]: 2025-11-26 08:52:30.905586749 +0000 UTC m=+0.163099221 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, container_name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Nov 26 03:52:30 localhost podman[99442]: 2025-11-26 08:52:30.945586832 +0000 UTC m=+0.199390949 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:52:30 localhost podman[99442]: unhealthy Nov 26 03:52:30 localhost podman[99449]: 2025-11-26 08:52:30.957879671 +0000 UTC m=+0.204762115 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:52:30 localhost systemd[1]: 
4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:52:30 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:52:31 localhost podman[99440]: 2025-11-26 08:52:31.002834857 +0000 UTC m=+0.264352682 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-iscsid-container, container_name=iscsid, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:52:31 localhost podman[99440]: 2025-11-26 08:52:31.016192309 +0000 UTC m=+0.277710094 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:52:31 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:52:31 localhost podman[99443]: 2025-11-26 08:52:31.110676013 +0000 UTC m=+0.361216509 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1) Nov 26 03:52:31 localhost podman[99441]: 2025-11-26 08:52:31.123315592 +0000 UTC m=+0.380828084 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public) Nov 26 03:52:31 localhost systemd[1]: 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:52:31 localhost podman[99443]: 2025-11-26 08:52:31.152396859 +0000 UTC m=+0.402937325 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 26 03:52:31 localhost podman[99443]: unhealthy Nov 26 03:52:31 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:52:31 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:52:31 localhost podman[99449]: 2025-11-26 08:52:31.343736979 +0000 UTC m=+0.590619373 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Nov 26 03:52:31 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:52:34 localhost sshd[99541]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:52:40 localhost systemd[1]: session-29.scope: Deactivated successfully. Nov 26 03:52:40 localhost systemd[1]: session-29.scope: Consumed 7min 7.476s CPU time. Nov 26 03:52:40 localhost systemd-logind[761]: Session 29 logged out. Waiting for processes to exit. Nov 26 03:52:40 localhost systemd-logind[761]: Removed session 29. Nov 26 03:52:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:52:48 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:52:48 localhost recover_tripleo_nova_virtqemud[99545]: 61604 Nov 26 03:52:48 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:52:48 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 03:52:48 localhost podman[99543]: 2025-11-26 08:52:48.822909948 +0000 UTC m=+0.087694705 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=metrics_qdr, version=17.1.12, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc.) Nov 26 03:52:49 localhost podman[99543]: 2025-11-26 08:52:49.06141546 +0000 UTC m=+0.326199957 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:52:49 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:52:50 localhost systemd[1]: Stopping User Manager for UID 1003... Nov 26 03:52:50 localhost systemd[36090]: Activating special unit Exit the Session... Nov 26 03:52:50 localhost systemd[36090]: Removed slice User Background Tasks Slice. Nov 26 03:52:50 localhost systemd[36090]: Stopped target Main User Target. Nov 26 03:52:50 localhost systemd[36090]: Stopped target Basic System. Nov 26 03:52:50 localhost systemd[36090]: Stopped target Paths. Nov 26 03:52:50 localhost systemd[36090]: Stopped target Sockets. Nov 26 03:52:50 localhost systemd[36090]: Stopped target Timers. 
Nov 26 03:52:50 localhost systemd[36090]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 26 03:52:50 localhost systemd[36090]: Stopped Daily Cleanup of User's Temporary Directories. Nov 26 03:52:50 localhost systemd[36090]: Closed D-Bus User Message Bus Socket. Nov 26 03:52:50 localhost systemd[36090]: Stopped Create User's Volatile Files and Directories. Nov 26 03:52:50 localhost systemd[36090]: Removed slice User Application Slice. Nov 26 03:52:50 localhost systemd[36090]: Reached target Shutdown. Nov 26 03:52:50 localhost systemd[36090]: Finished Exit the Session. Nov 26 03:52:50 localhost systemd[36090]: Reached target Exit the Session. Nov 26 03:52:50 localhost systemd[1]: user@1003.service: Deactivated successfully. Nov 26 03:52:50 localhost systemd[1]: Stopped User Manager for UID 1003. Nov 26 03:52:50 localhost systemd[1]: user@1003.service: Consumed 4.888s CPU time, read 0B from disk, written 7.0K to disk. Nov 26 03:52:50 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Nov 26 03:52:50 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Nov 26 03:52:50 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Nov 26 03:52:50 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Nov 26 03:52:50 localhost systemd[1]: Removed slice User Slice of UID 1003. Nov 26 03:52:50 localhost systemd[1]: user-1003.slice: Consumed 7min 12.392s CPU time. Nov 26 03:52:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:52:53 localhost podman[99574]: 2025-11-26 08:52:53.833981436 +0000 UTC m=+0.087172319 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z) Nov 26 03:52:53 localhost podman[99574]: 2025-11-26 08:52:53.867347004 +0000 UTC m=+0.120537877 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=) Nov 26 03:52:53 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:52:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:52:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:52:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:52:59 localhost podman[99599]: 2025-11-26 08:52:59.825707425 +0000 UTC m=+0.084649950 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4) Nov 26 03:52:59 localhost podman[99599]: 2025-11-26 08:52:59.838305811 +0000 UTC m=+0.097248296 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, release=1761123044) Nov 26 03:52:59 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:52:59 localhost systemd[1]: tmp-crun.eKsWyg.mount: Deactivated successfully. 
Nov 26 03:52:59 localhost podman[99600]: 2025-11-26 08:52:59.890667485 +0000 UTC m=+0.144406986 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:52:59 localhost podman[99600]: 2025-11-26 08:52:59.91821521 +0000 UTC m=+0.171954671 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:52:59 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:52:59 localhost podman[99598]: 2025-11-26 08:52:59.978739672 +0000 UTC m=+0.240227027 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Nov 26 03:53:00 localhost podman[99598]: 2025-11-26 08:53:00.037392794 +0000 UTC m=+0.298880179 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Nov 26 03:53:00 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 03:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:53:01 localhost podman[99674]: 2025-11-26 08:53:01.847232677 +0000 UTC m=+0.092490366 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:53:01 localhost podman[99674]: 2025-11-26 08:53:01.862338591 +0000 UTC m=+0.107596240 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc.) Nov 26 03:53:01 localhost podman[99674]: unhealthy Nov 26 03:53:01 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:53:01 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:53:01 localhost podman[99673]: 2025-11-26 08:53:01.903891716 +0000 UTC m=+0.151551791 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, distribution-scope=public, konflux.additional-tags=17.1.12 
17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=) Nov 26 03:53:01 localhost podman[99673]: 2025-11-26 08:53:01.916928676 +0000 UTC m=+0.164588721 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:53:01 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:53:02 localhost podman[99682]: 2025-11-26 08:53:02.005045743 +0000 UTC m=+0.243366015 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, distribution-scope=public, architecture=x86_64) Nov 26 03:53:02 localhost podman[99672]: 2025-11-26 08:53:02.057026636 +0000 UTC m=+0.308855862 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:53:02 localhost podman[99672]: 2025-11-26 08:53:02.068468295 +0000 UTC m=+0.320297581 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:53:02 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:53:02 localhost podman[99675]: 2025-11-26 08:53:02.119494157 +0000 UTC m=+0.359777741 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com) Nov 26 03:53:02 localhost podman[99675]: 2025-11-26 08:53:02.161843127 +0000 UTC m=+0.402126641 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, build-date=2025-11-19T00:14:25Z, vcs-type=git, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent) Nov 26 03:53:02 localhost podman[99675]: unhealthy Nov 26 03:53:02 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:53:02 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:53:02 localhost podman[99682]: 2025-11-26 08:53:02.367252889 +0000 UTC m=+0.605573141 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64) Nov 26 03:53:02 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:53:14 localhost sshd[99848]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:53:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:53:19 localhost systemd[1]: tmp-crun.klPPup.mount: Deactivated successfully. 
Nov 26 03:53:19 localhost podman[99850]: 2025-11-26 08:53:19.844127398 +0000 UTC m=+0.099251529 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, distribution-scope=public, version=17.1.12, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1) Nov 26 03:53:20 localhost podman[99850]: 2025-11-26 08:53:20.053037439 +0000 UTC m=+0.308161590 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:53:20 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:53:24 localhost podman[99879]: 2025-11-26 08:53:24.815561149 +0000 UTC m=+0.078293210 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:53:24 localhost podman[99879]: 2025-11-26 08:53:24.865701073 +0000 UTC m=+0.128433134 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.12, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z) Nov 26 03:53:24 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:53:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:53:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:53:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:53:30 localhost podman[99904]: 2025-11-26 08:53:30.84280856 +0000 UTC m=+0.097980368 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Nov 26 03:53:30 localhost podman[99904]: 2025-11-26 08:53:30.869357625 +0000 UTC m=+0.124529433 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2025-11-19T00:11:48Z, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:53:30 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:53:30 localhost podman[99906]: 2025-11-26 08:53:30.925187127 +0000 UTC m=+0.174052317 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi) Nov 26 03:53:30 localhost podman[99906]: 2025-11-26 08:53:30.974233568 +0000 UTC m=+0.223098738 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:53:30 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:53:31 localhost podman[99905]: 2025-11-26 08:53:31.005052896 +0000 UTC m=+0.257337553 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1) Nov 26 03:53:31 localhost podman[99905]: 2025-11-26 08:53:31.011622672 +0000 UTC m=+0.263907339 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=logrotate_crond, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 26 03:53:31 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:53:31 localhost systemd[1]: tmp-crun.fi7WLx.mount: Deactivated successfully. Nov 26 03:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:53:32 localhost systemd[1]: tmp-crun.pKqHRp.mount: Deactivated successfully. Nov 26 03:53:32 localhost podman[99975]: 2025-11-26 08:53:32.843042744 +0000 UTC m=+0.099766894 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-iscsid, architecture=x86_64) Nov 26 03:53:32 localhost systemd[1]: tmp-crun.eB3xka.mount: Deactivated successfully. Nov 26 03:53:32 localhost podman[99975]: 2025-11-26 08:53:32.890697101 +0000 UTC m=+0.147421221 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack 
Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:53:32 localhost podman[99976]: 2025-11-26 08:53:32.896891515 +0000 UTC m=+0.150607781 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:53:32 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:53:32 localhost podman[99976]: 2025-11-26 08:53:32.936247442 +0000 UTC m=+0.189963648 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=) Nov 26 03:53:32 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:53:32 localhost podman[99977]: 2025-11-26 08:53:32.947047881 +0000 UTC m=+0.197313418 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller) Nov 26 03:53:32 localhost podman[99977]: 2025-11-26 08:53:32.960833344 +0000 UTC m=+0.211098911 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:53:32 localhost podman[99977]: unhealthy Nov 26 03:53:32 localhost podman[99978]: 2025-11-26 08:53:32.8680678 +0000 UTC m=+0.114665432 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:53:32 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:53:32 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:53:33 localhost podman[99978]: 2025-11-26 08:53:33.001234663 +0000 UTC m=+0.247832195 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, tcib_managed=true) Nov 26 03:53:33 localhost podman[99978]: unhealthy Nov 26 03:53:33 localhost podman[99984]: 2025-11-26 08:53:33.008989776 +0000 UTC m=+0.250785987 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, distribution-scope=public, config_id=tripleo_step4) Nov 26 03:53:33 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:53:33 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed 
with result 'exit-code'. Nov 26 03:53:33 localhost podman[99984]: 2025-11-26 08:53:33.424003641 +0000 UTC m=+0.665799902 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:53:33 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:53:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:53:50 localhost podman[100073]: 2025-11-26 08:53:50.814791326 +0000 UTC m=+0.073562381 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) 
Nov 26 03:53:51 localhost podman[100073]: 2025-11-26 08:53:51.042525239 +0000 UTC m=+0.301296314 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true) Nov 26 03:53:51 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:53:55 localhost sshd[100102]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:53:55 localhost podman[100104]: 2025-11-26 08:53:55.812098902 +0000 UTC m=+0.074759720 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 26 03:53:55 localhost podman[100104]: 2025-11-26 08:53:55.871478066 +0000 UTC m=+0.134138824 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=) Nov 26 03:53:55 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:54:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:54:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. 
Nov 26 03:54:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:54:01 localhost systemd[1]: tmp-crun.lT2Gzl.mount: Deactivated successfully. Nov 26 03:54:01 localhost podman[100130]: 2025-11-26 08:54:01.813500212 +0000 UTC m=+0.078011621 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:54:01 localhost podman[100132]: 2025-11-26 08:54:01.874723005 +0000 UTC m=+0.139833933 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Nov 26 03:54:01 localhost podman[100131]: 2025-11-26 08:54:01.838246489 +0000 UTC m=+0.101577381 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Nov 26 03:54:01 localhost podman[100132]: 2025-11-26 08:54:01.904343885 +0000 UTC m=+0.169454773 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12) Nov 26 03:54:01 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:54:01 localhost podman[100131]: 2025-11-26 08:54:01.925448958 +0000 UTC m=+0.188779870 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public) Nov 26 03:54:01 localhost podman[100130]: 2025-11-26 08:54:01.943695981 +0000 UTC m=+0.208207410 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ceilometer-compute) Nov 26 03:54:01 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:54:01 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:54:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:54:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:54:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:54:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:54:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:54:03 localhost podman[100204]: 2025-11-26 08:54:03.834721244 +0000 UTC m=+0.088971125 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, version=17.1.12) Nov 26 03:54:03 localhost podman[100204]: 2025-11-26 08:54:03.846237156 +0000 UTC m=+0.100487017 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:54:03 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:54:03 localhost podman[100203]: 2025-11-26 08:54:03.882003679 +0000 UTC m=+0.140714930 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:54:03 localhost podman[100203]: 2025-11-26 08:54:03.890413774 +0000 UTC m=+0.149125055 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-iscsid-container) Nov 26 03:54:03 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:54:03 localhost podman[100212]: 2025-11-26 08:54:03.934333293 +0000 UTC m=+0.180646855 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, container_name=nova_migration_target) Nov 26 03:54:03 localhost podman[100205]: 2025-11-26 08:54:03.99887527 +0000 UTC m=+0.251420187 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, config_id=tripleo_step4, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, release=1761123044, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:54:04 localhost podman[100205]: 2025-11-26 08:54:04.043271044 +0000 UTC m=+0.295815951 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64) Nov 26 03:54:04 localhost podman[100205]: unhealthy Nov 26 03:54:04 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:54:04 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:54:04 localhost podman[100209]: 2025-11-26 08:54:04.056474479 +0000 UTC m=+0.303438601 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 26 03:54:04 localhost podman[100209]: 2025-11-26 08:54:04.09852658 +0000 UTC m=+0.345490742 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:54:04 localhost podman[100209]: unhealthy Nov 26 03:54:04 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:54:04 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:54:04 localhost podman[100212]: 2025-11-26 08:54:04.314359978 +0000 UTC m=+0.560673580 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:54:04 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:54:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:54:21 localhost systemd[1]: tmp-crun.AF0EXc.mount: Deactivated successfully. 
Nov 26 03:54:21 localhost podman[100375]: 2025-11-26 08:54:21.834220218 +0000 UTC m=+0.091774694 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:54:22 localhost podman[100375]: 2025-11-26 08:54:22.010304458 +0000 UTC m=+0.267858964 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:54:22 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:54:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:54:26 localhost podman[100405]: 2025-11-26 08:54:26.816625144 +0000 UTC m=+0.075455611 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 26 03:54:26 localhost podman[100405]: 2025-11-26 08:54:26.843714024 +0000 UTC m=+0.102544521 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, 
config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-type=git) Nov 26 03:54:26 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:54:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:54:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:54:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:54:32 localhost systemd[1]: tmp-crun.yhqBXv.mount: Deactivated successfully. 
Nov 26 03:54:32 localhost podman[100430]: 2025-11-26 08:54:32.82906914 +0000 UTC m=+0.091971310 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, config_id=tripleo_step4) Nov 26 03:54:32 localhost podman[100430]: 2025-11-26 08:54:32.863700237 +0000 UTC m=+0.126602417 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc.) Nov 26 03:54:32 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:54:32 localhost systemd[1]: tmp-crun.E0hb1O.mount: Deactivated successfully. 
Nov 26 03:54:32 localhost podman[100431]: 2025-11-26 08:54:32.895440725 +0000 UTC m=+0.152204152 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc.) Nov 26 03:54:32 localhost podman[100432]: 2025-11-26 08:54:32.932124577 +0000 UTC m=+0.185960332 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, release=1761123044, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:54:32 localhost podman[100431]: 2025-11-26 08:54:32.974791687 +0000 UTC m=+0.231555134 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, release=1761123044, summary=Red 
Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, distribution-scope=public, name=rhosp17/openstack-cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12) Nov 26 03:54:32 localhost podman[100432]: 2025-11-26 08:54:32.983794059 +0000 UTC m=+0.237629744 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, container_name=ceilometer_agent_ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:54:32 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:54:33 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:54:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:54:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:54:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:54:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:54:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:54:34 localhost podman[100505]: 2025-11-26 08:54:34.860470253 +0000 UTC m=+0.106091253 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 26 03:54:34 localhost podman[100517]: 2025-11-26 08:54:34.925626249 +0000 UTC m=+0.165003823 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, 
io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4) Nov 26 03:54:34 localhost podman[100503]: 2025-11-26 08:54:34.839136553 +0000 UTC m=+0.093886150 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com) Nov 26 03:54:34 localhost podman[100503]: 2025-11-26 08:54:34.970099666 +0000 UTC m=+0.224849273 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, vcs-type=git, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Nov 26 03:54:34 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:54:34 localhost podman[100505]: 2025-11-26 08:54:34.990320461 +0000 UTC m=+0.235941461 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z) Nov 26 03:54:34 localhost podman[100502]: 2025-11-26 08:54:34.892741127 +0000 UTC m=+0.149113905 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, 
config_id=tripleo_step3, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc.) 
Nov 26 03:54:35 localhost podman[100504]: 2025-11-26 08:54:35.010862697 +0000 UTC m=+0.259263984 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, container_name=ovn_controller, io.openshift.expose-services=, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:54:35 localhost podman[100502]: 2025-11-26 08:54:35.02210502 +0000 UTC m=+0.278477828 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, description=Red Hat 
OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:54:35 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:54:35 localhost podman[100505]: unhealthy Nov 26 03:54:35 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:54:35 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:54:35 localhost podman[100504]: 2025-11-26 08:54:35.077372415 +0000 UTC m=+0.325773682 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, url=https://www.redhat.com, container_name=ovn_controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:54:35 localhost podman[100504]: unhealthy Nov 26 03:54:35 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:54:35 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:54:35 localhost podman[100517]: 2025-11-26 08:54:35.285785161 +0000 UTC m=+0.525162705 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:54:35 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:54:36 localhost sshd[100600]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:54:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:54:47 localhost recover_tripleo_nova_virtqemud[100603]: 61604 Nov 26 03:54:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:54:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:54:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:54:52 localhost podman[100604]: 2025-11-26 08:54:52.834572837 +0000 UTC m=+0.087785448 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, version=17.1.12, com.redhat.component=openstack-qdrouterd-container) Nov 26 03:54:53 localhost podman[100604]: 2025-11-26 08:54:53.023404629 +0000 UTC m=+0.276617300 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:54:53 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:54:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:54:57 localhost podman[100634]: 2025-11-26 08:54:57.821537136 +0000 UTC m=+0.083307047 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=nova_compute, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:54:57 localhost podman[100634]: 2025-11-26 08:54:57.847382769 +0000 UTC m=+0.109152650 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, 
config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 03:54:57 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:55:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:55:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:55:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:55:03 localhost systemd[1]: tmp-crun.GFomJM.mount: Deactivated successfully. 
Nov 26 03:55:03 localhost podman[100660]: 2025-11-26 08:55:03.837182106 +0000 UTC m=+0.101626294 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:55:03 localhost podman[100660]: 2025-11-26 08:55:03.865279218 +0000 UTC m=+0.129723396 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:55:03 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:55:03 localhost podman[100662]: 2025-11-26 08:55:03.906603467 +0000 UTC m=+0.162414573 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:55:03 localhost podman[100662]: 2025-11-26 08:55:03.941404949 +0000 UTC m=+0.197216065 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1) Nov 26 03:55:03 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:55:03 localhost podman[100661]: 2025-11-26 08:55:03.984806392 +0000 UTC m=+0.243895801 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 26 03:55:04 localhost podman[100661]: 2025-11-26 08:55:03.998205613 +0000 UTC m=+0.257294992 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4) Nov 26 03:55:04 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:55:04 localhost systemd[1]: tmp-crun.cr5mH2.mount: Deactivated successfully. Nov 26 03:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 03:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:55:05 localhost podman[100732]: 2025-11-26 08:55:05.823155791 +0000 UTC m=+0.079307422 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 26 03:55:05 localhost podman[100732]: 2025-11-26 08:55:05.833194296 +0000 UTC m=+0.089345897 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:55:05 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:55:05 localhost podman[100745]: 2025-11-26 08:55:05.837102459 +0000 UTC m=+0.081945395 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:55:05 localhost podman[100739]: 2025-11-26 08:55:05.90114388 +0000 UTC m=+0.148006489 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Nov 26 03:55:05 localhost podman[100733]: 2025-11-26 08:55:05.880992527 +0000 UTC m=+0.133908126 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., container_name=ovn_controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:55:05 localhost podman[100739]: 2025-11-26 08:55:05.940196997 +0000 UTC m=+0.187059606 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4) Nov 26 03:55:05 localhost podman[100739]: unhealthy Nov 26 03:55:05 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:55:05 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:55:05 localhost podman[100731]: 2025-11-26 08:55:05.981124092 +0000 UTC m=+0.240868576 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-type=git, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:55:05 localhost podman[100731]: 2025-11-26 08:55:05.992227011 +0000 UTC m=+0.251971525 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, container_name=iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 26 03:55:06 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:55:06 localhost podman[100733]: 2025-11-26 08:55:06.014495011 +0000 UTC m=+0.267410640 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 26 03:55:06 localhost podman[100733]: unhealthy Nov 26 03:55:06 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:55:06 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:55:06 localhost podman[100745]: 2025-11-26 08:55:06.207554614 +0000 UTC m=+0.452397580 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64) Nov 26 03:55:06 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:55:06 localhost systemd[1]: tmp-crun.1CRwpu.mount: Deactivated successfully. Nov 26 03:55:16 localhost sshd[100902]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:55:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 03:55:23 localhost podman[100905]: 2025-11-26 08:55:23.835043031 +0000 UTC m=+0.094229860 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, 
batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Nov 26 03:55:24 localhost podman[100905]: 2025-11-26 08:55:24.034354571 +0000 UTC m=+0.293541440 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd) Nov 26 03:55:24 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:55:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:55:28 localhost podman[100934]: 2025-11-26 08:55:28.819863424 +0000 UTC m=+0.082466222 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5) Nov 26 03:55:28 localhost podman[100934]: 2025-11-26 08:55:28.852325533 +0000 UTC m=+0.114928371 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO 
Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:55:28 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:55:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:55:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:55:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:55:34 localhost podman[100962]: 2025-11-26 08:55:34.823741771 +0000 UTC m=+0.083350529 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red 
Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z) Nov 26 03:55:34 localhost podman[100961]: 2025-11-26 08:55:34.889760334 +0000 UTC m=+0.149321861 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, distribution-scope=public) Nov 26 03:55:34 localhost podman[100961]: 2025-11-26 08:55:34.897392534 +0000 UTC m=+0.156954071 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:55:34 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:55:34 localhost podman[100960]: 2025-11-26 08:55:34.937193313 +0000 UTC m=+0.200687503 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 03:55:34 localhost podman[100962]: 2025-11-26 08:55:34.9574318 +0000 UTC m=+0.217040528 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, 
release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:55:34 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:55:34 localhost podman[100960]: 2025-11-26 08:55:34.994469123 +0000 UTC m=+0.257963323 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:55:35 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:55:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:55:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:55:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:55:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:55:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:55:36 localhost podman[101034]: 2025-11-26 08:55:36.831102557 +0000 UTC m=+0.092231118 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:55:36 localhost systemd[1]: tmp-crun.ICIyl3.mount: Deactivated successfully. Nov 26 03:55:36 localhost podman[101035]: 2025-11-26 08:55:36.901317442 +0000 UTC m=+0.155458943 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044) Nov 26 03:55:36 localhost podman[101035]: 2025-11-26 08:55:36.939478771 +0000 UTC m=+0.193620252 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, 
version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true) Nov 26 03:55:36 localhost podman[101042]: 2025-11-26 08:55:36.948631138 +0000 UTC m=+0.195884273 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:55:36 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:55:37 localhost podman[101036]: 2025-11-26 08:55:36.999214247 +0000 UTC m=+0.250224520 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, container_name=ovn_controller, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:55:37 localhost podman[101036]: 2025-11-26 08:55:37.014501328 +0000 UTC m=+0.265511571 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, distribution-scope=public, 
name=rhosp17/openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc.) Nov 26 03:55:37 localhost podman[101042]: 2025-11-26 08:55:37.016869312 +0000 UTC m=+0.264122397 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:55:37 localhost podman[101042]: unhealthy Nov 26 03:55:37 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:55:37 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:55:37 localhost podman[101036]: unhealthy Nov 26 03:55:37 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:55:37 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:55:37 localhost podman[101043]: 2025-11-26 08:55:37.155343201 +0000 UTC m=+0.400329715 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team) Nov 26 03:55:37 localhost podman[101034]: 2025-11-26 08:55:37.172664645 +0000 UTC m=+0.433793236 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:55:37 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:55:37 localhost podman[101043]: 2025-11-26 08:55:37.568491057 +0000 UTC m=+0.813477611 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 26 03:55:37 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:55:37 localhost systemd[1]: tmp-crun.1jTsbC.mount: Deactivated successfully. Nov 26 03:55:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:55:54 localhost podman[101141]: 2025-11-26 08:55:54.822190697 +0000 UTC m=+0.086786086 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=) Nov 26 03:55:55 localhost podman[101141]: 2025-11-26 08:55:55.437310436 +0000 UTC m=+0.701905795 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, 
architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 26 03:55:55 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:55:56 localhost sshd[101168]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:55:56 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:55:56 localhost recover_tripleo_nova_virtqemud[101171]: 61604 Nov 26 03:55:56 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:55:56 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:55:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:55:59 localhost podman[101172]: 2025-11-26 08:55:59.818895251 +0000 UTC m=+0.078845427 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, version=17.1.12, container_name=nova_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:55:59 localhost podman[101172]: 2025-11-26 08:55:59.876417698 +0000 UTC m=+0.136367874 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, release=1761123044, container_name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 26 03:55:59 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. 
Nov 26 03:56:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:56:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4200.1 total, 600.0 interval
Cumulative writes: 5692 writes, 25K keys, 5692 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 5692 writes, 763 syncs, 7.46 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 03:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:56:05 localhost systemd[1]: tmp-crun.MPTEaL.mount: Deactivated successfully. 
Nov 26 03:56:05 localhost podman[101198]: 2025-11-26 08:56:05.872268476 +0000 UTC m=+0.134939909 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, 
name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, release=1761123044, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 03:56:05 localhost podman[101200]: 2025-11-26 08:56:05.840717745 +0000 UTC m=+0.094737056 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true) Nov 26 03:56:05 localhost podman[101199]: 2025-11-26 08:56:05.930440613 +0000 UTC m=+0.186436897 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044) Nov 26 03:56:05 localhost podman[101198]: 2025-11-26 
08:56:05.961578771 +0000 UTC m=+0.224250194 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible) Nov 26 03:56:05 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:56:05 localhost podman[101200]: 2025-11-26 08:56:05.974711014 +0000 UTC m=+0.228730265 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, tcib_managed=true, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:56:06 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:56:06 localhost podman[101199]: 2025-11-26 08:56:06.018095536 +0000 UTC m=+0.274091870 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:56:06 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:56:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:56:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:56:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:56:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:56:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:56:07 localhost podman[101271]: 2025-11-26 08:56:07.838020347 +0000 UTC m=+0.095296874 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 26 03:56:07 localhost podman[101271]: 2025-11-26 08:56:07.847973529 +0000 UTC m=+0.105250046 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:56:07 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:56:07 localhost podman[101272]: 2025-11-26 08:56:07.893200249 +0000 UTC m=+0.144752746 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible) Nov 26 03:56:07 localhost podman[101272]: 2025-11-26 08:56:07.910351559 +0000 UTC m=+0.161904066 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 26 03:56:07 localhost podman[101272]: unhealthy Nov 26 03:56:07 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:56:07 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:56:07 localhost podman[101284]: 2025-11-26 08:56:07.987741809 +0000 UTC m=+0.231398578 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 03:56:08 localhost podman[101270]: 2025-11-26 08:56:08.049027494 +0000 UTC m=+0.307859550 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true) Nov 26 03:56:08 localhost podman[101270]: 2025-11-26 08:56:08.086454169 +0000 UTC m=+0.345286255 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid) Nov 26 03:56:08 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:56:08 localhost podman[101273]: 2025-11-26 08:56:08.104113044 +0000 UTC m=+0.350840469 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, distribution-scope=public, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 03:56:08 localhost podman[101273]: 2025-11-26 08:56:08.147316211 +0000 UTC 
m=+0.394043606 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 03:56:08 localhost podman[101273]: unhealthy Nov 26 03:56:08 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:56:08 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:56:08 localhost podman[101284]: 2025-11-26 08:56:08.360705713 +0000 UTC m=+0.604362492 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, container_name=nova_migration_target, release=1761123044, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com) Nov 26 03:56:08 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:56:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 03:56:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 4860 writes, 21K keys, 4860 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4860 writes, 621 syncs, 7.83 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 03:56:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:56:25 localhost systemd[1]: tmp-crun.RfejSv.mount: Deactivated successfully. 
Nov 26 03:56:25 localhost podman[101446]: 2025-11-26 08:56:25.82943524 +0000 UTC m=+0.085064982 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team) Nov 26 03:56:26 localhost podman[101446]: 2025-11-26 08:56:26.014211014 +0000 UTC m=+0.269840746 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:56:26 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:56:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:56:30 localhost podman[101475]: 2025-11-26 08:56:30.816010077 +0000 UTC m=+0.075748010 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, container_name=nova_compute, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git) Nov 26 03:56:30 localhost podman[101475]: 2025-11-26 08:56:30.847906209 +0000 UTC m=+0.107644142 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, container_name=nova_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true) Nov 26 03:56:30 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:56:36 localhost sshd[101501]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:56:36 localhost systemd[1]: tmp-crun.xuxHK6.mount: Deactivated successfully. 
Nov 26 03:56:36 localhost podman[101503]: 2025-11-26 08:56:36.835841207 +0000 UTC m=+0.096361997 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team) Nov 26 03:56:36 localhost systemd[1]: tmp-crun.st6DNd.mount: Deactivated successfully. Nov 26 03:56:36 localhost podman[101505]: 2025-11-26 08:56:36.884497275 +0000 UTC m=+0.139605865 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z) Nov 26 03:56:36 localhost podman[101503]: 2025-11-26 08:56:36.937672735 +0000 UTC m=+0.198193565 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, 
architecture=x86_64, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container) Nov 26 03:56:36 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:56:36 localhost podman[101505]: 2025-11-26 08:56:36.99671268 +0000 UTC m=+0.251821320 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:56:37 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 03:56:37 localhost podman[101504]: 2025-11-26 08:56:37.082324789 +0000 UTC m=+0.341484416 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4) Nov 26 03:56:37 localhost podman[101504]: 2025-11-26 08:56:37.119508206 +0000 UTC m=+0.378667823 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 
'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 26 03:56:37 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:56:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:56:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:56:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:56:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:56:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:56:38 localhost podman[101574]: 2025-11-26 08:56:38.845752694 +0000 UTC m=+0.104492852 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Nov 26 03:56:38 localhost podman[101574]: 2025-11-26 08:56:38.878980258 +0000 UTC m=+0.137720406 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Nov 26 03:56:38 localhost systemd[1]: tmp-crun.7xUE3z.mount: Deactivated successfully. Nov 26 03:56:38 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:56:38 localhost podman[101576]: 2025-11-26 08:56:38.901829325 +0000 UTC m=+0.153060577 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, distribution-scope=public, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Nov 26 03:56:38 localhost podman[101589]: 2025-11-26 08:56:38.952038052 +0000 UTC m=+0.193420795 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, 
version=17.1.12, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1) Nov 26 03:56:38 localhost podman[101576]: 2025-11-26 08:56:38.969531543 +0000 UTC m=+0.220762795 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team) Nov 26 03:56:38 localhost podman[101576]: unhealthy Nov 26 03:56:38 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:56:38 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 03:56:39 localhost podman[101575]: 2025-11-26 08:56:39.065428484 +0000 UTC m=+0.319145814 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.buildah.version=1.41.4, url=https://www.redhat.com) Nov 26 03:56:39 localhost podman[101582]: 2025-11-26 08:56:39.108661862 +0000 UTC m=+0.353565576 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:56:39 localhost podman[101575]: 2025-11-26 08:56:39.127880505 +0000 UTC m=+0.381597835 
container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T22:51:28Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:56:39 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:56:39 localhost podman[101582]: 2025-11-26 08:56:39.148445541 +0000 UTC m=+0.393349255 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, 
distribution-scope=public, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044) Nov 26 03:56:39 localhost podman[101582]: unhealthy Nov 26 03:56:39 localhost 
systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:56:39 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:56:39 localhost podman[101589]: 2025-11-26 08:56:39.350741426 +0000 UTC m=+0.592124169 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12) Nov 26 03:56:39 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:56:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:56:56 localhost systemd[1]: tmp-crun.9gF3zv.mount: Deactivated successfully. 
Nov 26 03:56:56 localhost podman[101675]: 2025-11-26 08:56:56.840919531 +0000 UTC m=+0.093353252 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:56:57 localhost podman[101675]: 2025-11-26 08:56:57.054916912 +0000 UTC m=+0.307350623 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, version=17.1.12) Nov 26 03:56:57 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:57:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:57:01 localhost systemd[1]: tmp-crun.c42g6c.mount: Deactivated successfully. 
Nov 26 03:57:01 localhost podman[101705]: 2025-11-26 08:57:01.83163932 +0000 UTC m=+0.092355422 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, release=1761123044, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible) Nov 26 03:57:01 localhost podman[101705]: 2025-11-26 08:57:01.858764821 +0000 UTC m=+0.119480923 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:57:01 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:57:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:57:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:57:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:57:07 localhost podman[101735]: 2025-11-26 08:57:07.831778131 +0000 UTC m=+0.081980816 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z) Nov 26 03:57:07 localhost podman[101734]: 2025-11-26 08:57:07.885773147 +0000 UTC m=+0.139548014 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., 
vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z) Nov 26 03:57:07 localhost podman[101736]: 2025-11-26 08:57:07.937275885 +0000 UTC m=+0.182297237 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 26 03:57:07 localhost podman[101734]: 2025-11-26 
08:57:07.947349441 +0000 UTC m=+0.201124278 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:57:07 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:57:07 localhost podman[101735]: 2025-11-26 08:57:07.964770968 +0000 UTC m=+0.214973603 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.expose-services=, tcib_managed=true, version=17.1.12) Nov 26 03:57:07 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:57:07 localhost podman[101736]: 2025-11-26 08:57:07.990731513 +0000 UTC m=+0.235752825 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 26 03:57:08 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:57:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:57:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:57:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:57:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:57:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:57:09 localhost podman[101808]: 2025-11-26 08:57:09.832768068 +0000 UTC m=+0.089573024 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, 
managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:57:09 localhost podman[101808]: 2025-11-26 08:57:09.838836678 +0000 UTC m=+0.095641594 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git) Nov 26 03:57:09 localhost podman[101808]: unhealthy Nov 26 03:57:09 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:57:09 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:57:09 localhost podman[101806]: 2025-11-26 08:57:09.882587203 +0000 UTC m=+0.147872886 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044) Nov 26 03:57:09 localhost systemd[1]: tmp-crun.zbtYFZ.mount: Deactivated successfully. 
Nov 26 03:57:09 localhost podman[101807]: 2025-11-26 08:57:09.932385896 +0000 UTC m=+0.196786491 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, distribution-scope=public) Nov 26 03:57:09 localhost podman[101806]: 2025-11-26 08:57:09.947336327 +0000 UTC m=+0.212622050 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.component=openstack-iscsid-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid) Nov 26 03:57:09 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:57:09 localhost podman[101820]: 2025-11-26 08:57:09.986669602 +0000 UTC m=+0.240019630 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, container_name=nova_migration_target) Nov 26 03:57:09 localhost podman[101807]: 2025-11-26 08:57:09.997613736 +0000 UTC m=+0.262014341 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container) Nov 26 03:57:10 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:57:10 localhost podman[101809]: 2025-11-26 08:57:10.050064502 +0000 UTC m=+0.305510426 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 03:57:10 localhost podman[101809]: 2025-11-26 08:57:10.064666171 +0000 UTC m=+0.320112145 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:57:10 localhost podman[101809]: unhealthy Nov 26 03:57:10 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:57:10 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:57:10 localhost podman[101820]: 2025-11-26 08:57:10.348654931 +0000 UTC m=+0.602004969 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=) Nov 26 03:57:10 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:57:14 localhost sshd[101906]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:57:27 localhost systemd[1]: tmp-crun.qcZSdq.mount: Deactivated successfully. 
Nov 26 03:57:27 localhost podman[101984]: 2025-11-26 08:57:27.851100954 +0000 UTC m=+0.105661300 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:57:28 localhost podman[101984]: 2025-11-26 08:57:28.104624266 +0000 UTC m=+0.359184612 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 
1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible) Nov 26 03:57:28 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:57:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:57:32 localhost podman[102013]: 2025-11-26 08:57:32.818319162 +0000 UTC m=+0.083829874 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, release=1761123044, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:57:32 localhost podman[102013]: 2025-11-26 08:57:32.873485454 +0000 UTC m=+0.138996196 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, 
name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044) Nov 26 03:57:32 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:57:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:57:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:57:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:57:38 localhost podman[102040]: 2025-11-26 08:57:38.819975259 +0000 UTC m=+0.076060350 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, 
com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true) Nov 26 03:57:38 localhost podman[102040]: 2025-11-26 08:57:38.856374072 +0000 UTC m=+0.112459103 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Nov 26 03:57:38 localhost podman[102039]: 2025-11-26 08:57:38.869058431 +0000 UTC m=+0.127465445 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:57:38 localhost systemd[1]: 
7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:57:38 localhost podman[102041]: 2025-11-26 08:57:38.932862404 +0000 UTC m=+0.183553755 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:57:38 localhost podman[102039]: 2025-11-26 08:57:38.94864223 +0000 UTC m=+0.207049294 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12) Nov 26 03:57:38 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:57:38 localhost podman[102041]: 2025-11-26 08:57:38.965270682 +0000 UTC m=+0.215962023 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi) Nov 26 03:57:38 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:57:39 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:57:39 localhost recover_tripleo_nova_virtqemud[102111]: 61604 Nov 26 03:57:39 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:57:39 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 03:57:39 localhost systemd[1]: tmp-crun.4y2ZBp.mount: Deactivated successfully. Nov 26 03:57:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:57:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:57:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:57:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:57:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:57:40 localhost systemd[1]: tmp-crun.hRHboZ.mount: Deactivated successfully. Nov 26 03:57:40 localhost podman[102113]: 2025-11-26 08:57:40.803199868 +0000 UTC m=+0.066510240 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 collectd, vcs-type=git, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true) Nov 26 03:57:40 localhost systemd[1]: tmp-crun.StklCV.mount: Deactivated successfully. Nov 26 03:57:40 localhost podman[102114]: 2025-11-26 08:57:40.836963738 +0000 UTC m=+0.093447125 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:57:40 localhost podman[102125]: 2025-11-26 08:57:40.863672928 +0000 UTC m=+0.119128283 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., 
container_name=ovn_metadata_agent, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:57:40 localhost podman[102114]: 2025-11-26 08:57:40.920242444 +0000 UTC m=+0.176725831 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z) Nov 26 03:57:40 localhost podman[102114]: unhealthy Nov 26 03:57:40 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:57:40 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:57:40 localhost podman[102112]: 2025-11-26 08:57:40.888748795 +0000 UTC m=+0.151415116 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=iscsid, vcs-type=git, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 26 03:57:40 localhost podman[102125]: 2025-11-26 08:57:40.946144057 +0000 UTC m=+0.201599382 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_id=tripleo_step4, 
maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent) Nov 26 03:57:40 localhost podman[102125]: unhealthy Nov 26 03:57:40 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:57:40 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:57:40 localhost podman[102127]: 2025-11-26 08:57:40.997363076 +0000 UTC m=+0.247149253 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Nov 26 03:57:41 localhost podman[102113]: 2025-11-26 08:57:41.019917854 +0000 UTC m=+0.283228346 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd) Nov 26 03:57:41 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:57:41 localhost podman[102112]: 2025-11-26 08:57:41.072047952 +0000 UTC m=+0.334714273 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 26 03:57:41 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:57:41 localhost podman[102127]: 2025-11-26 08:57:41.340292207 +0000 UTC m=+0.590078434 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:57:41 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:57:52 localhost sshd[102212]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:57:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:57:58 localhost podman[102214]: 2025-11-26 08:57:58.797455376 +0000 UTC m=+0.060542561 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true) Nov 26 03:57:58 localhost podman[102214]: 2025-11-26 08:57:58.989195999 +0000 UTC m=+0.252283134 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, 
Inc., release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z) Nov 26 03:57:58 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:58:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:58:03 localhost podman[102243]: 2025-11-26 08:58:03.822650218 +0000 UTC m=+0.089933516 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:58:03 localhost podman[102243]: 2025-11-26 08:58:03.855034375 +0000 UTC m=+0.122317643 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:58:03 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:58:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:58:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:58:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 03:58:09 localhost systemd[1]: tmp-crun.2fdk3r.mount: Deactivated successfully. 
Nov 26 03:58:09 localhost podman[102269]: 2025-11-26 08:58:09.839917647 +0000 UTC m=+0.102330474 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z) Nov 26 03:58:09 localhost podman[102269]: 2025-11-26 08:58:09.872433879 +0000 UTC m=+0.134846706 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 26 03:58:09 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:58:09 localhost podman[102271]: 2025-11-26 08:58:09.890704062 +0000 UTC m=+0.141383671 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:58:09 localhost podman[102270]: 2025-11-26 08:58:09.941007502 +0000 UTC m=+0.199530767 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, 
distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:58:09 localhost podman[102270]: 2025-11-26 08:58:09.949126887 +0000 UTC m=+0.207650152 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, container_name=logrotate_crond, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:58:09 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:58:10 localhost podman[102271]: 2025-11-26 08:58:10.00015844 +0000 UTC m=+0.250838059 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:58:10 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:58:11 localhost podman[102341]: 2025-11-26 08:58:11.831151549 +0000 UTC m=+0.087959004 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_id=tripleo_step3, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1) Nov 26 03:58:11 localhost podman[102344]: 2025-11-26 08:58:11.885698111 +0000 UTC m=+0.135900048 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044) Nov 26 03:58:11 localhost podman[102341]: 2025-11-26 08:58:11.945834641 +0000 UTC m=+0.202642086 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, 
summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Nov 26 03:58:11 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:58:11 localhost podman[102343]: 2025-11-26 08:58:11.988626464 +0000 UTC m=+0.241073032 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044) Nov 26 03:58:12 localhost podman[102342]: 2025-11-26 08:58:12.038554313 +0000 UTC m=+0.292844279 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=ovn_controller) Nov 26 03:58:12 localhost podman[102340]: 2025-11-26 08:58:11.945727357 +0000 UTC m=+0.203631427 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, container_name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:58:12 localhost podman[102343]: 2025-11-26 08:58:12.057429286 +0000 UTC m=+0.309875844 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 26 03:58:12 localhost podman[102343]: unhealthy Nov 26 03:58:12 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:58:12 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:58:12 localhost podman[102342]: 2025-11-26 08:58:12.07986411 +0000 UTC m=+0.334154036 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 03:58:12 localhost podman[102342]: unhealthy Nov 26 03:58:12 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:58:12 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:58:12 localhost podman[102340]: 2025-11-26 08:58:12.130275634 +0000 UTC m=+0.388179704 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true) Nov 26 03:58:12 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:58:12 localhost podman[102344]: 2025-11-26 08:58:12.221140027 +0000 UTC m=+0.471341874 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, release=1761123044, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, 
io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z) Nov 26 03:58:12 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:58:12 localhost systemd[1]: tmp-crun.1o0J5x.mount: Deactivated successfully. Nov 26 03:58:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:58:29 localhost systemd[1]: tmp-crun.gZ9ZlW.mount: Deactivated successfully. 
Nov 26 03:58:29 localhost podman[102565]: 2025-11-26 08:58:29.878113875 +0000 UTC m=+0.141570457 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-qdrouterd, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 26 03:58:30 localhost podman[102565]: 2025-11-26 08:58:30.055202256 +0000 UTC m=+0.318658828 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vcs-type=git, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 03:58:30 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:58:32 localhost sshd[102594]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:58:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 03:58:34 localhost podman[102596]: 2025-11-26 08:58:34.826687109 +0000 UTC m=+0.087376656 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:58:34 localhost podman[102596]: 2025-11-26 08:58:34.888626234 +0000 UTC m=+0.149315751 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, 
batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public) Nov 26 03:58:34 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:58:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:58:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:58:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:58:40 localhost podman[102623]: 2025-11-26 08:58:40.816501226 +0000 UTC m=+0.082376389 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:11:48Z, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1761123044) Nov 26 03:58:40 localhost systemd[1]: tmp-crun.1dLYMq.mount: Deactivated successfully. Nov 26 03:58:40 localhost podman[102623]: 2025-11-26 08:58:40.874395874 +0000 UTC m=+0.140271057 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
architecture=x86_64, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 26 03:58:40 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 03:58:40 localhost podman[102624]: 2025-11-26 08:58:40.927554013 +0000 UTC m=+0.191250917 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:58:40 localhost podman[102625]: 2025-11-26 08:58:40.874249179 +0000 UTC m=+0.135136535 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:58:40 localhost podman[102624]: 2025-11-26 08:58:40.962242354 +0000 UTC m=+0.225939258 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, 
version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=) Nov 26 03:58:40 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:58:41 localhost podman[102625]: 2025-11-26 08:58:41.011444188 +0000 UTC m=+0.272331544 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, io.openshift.expose-services=) Nov 26 03:58:41 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:58:42 localhost systemd[1]: tmp-crun.c0QAO6.mount: Deactivated successfully. 
Nov 26 03:58:42 localhost podman[102699]: 2025-11-26 08:58:42.833425173 +0000 UTC m=+0.093660793 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 03:58:42 localhost podman[102698]: 2025-11-26 08:58:42.870774266 +0000 UTC m=+0.136582031 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, vcs-type=git, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com) Nov 26 03:58:42 localhost podman[102707]: 2025-11-26 08:58:42.89129084 +0000 UTC m=+0.142184596 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 
nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, container_name=nova_migration_target) Nov 26 03:58:42 localhost podman[102699]: 2025-11-26 08:58:42.897106223 +0000 UTC m=+0.157341833 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3) Nov 26 03:58:42 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 03:58:42 localhost podman[102700]: 2025-11-26 08:58:42.970434057 +0000 UTC m=+0.228851179 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:58:43 localhost podman[102700]: 2025-11-26 08:58:43.012699744 +0000 UTC m=+0.271116916 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.component=openstack-ovn-controller-container, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-ovn-controller) Nov 26 03:58:43 localhost podman[102700]: unhealthy Nov 26 03:58:43 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:58:43 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:58:43 localhost podman[102701]: 2025-11-26 08:58:43.030371569 +0000 UTC m=+0.286811529 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z) Nov 26 03:58:43 localhost podman[102701]: 2025-11-26 08:58:43.047364692 +0000 UTC m=+0.303804632 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, release=1761123044, version=17.1.12, 
io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Nov 26 03:58:43 localhost podman[102701]: unhealthy Nov 26 03:58:43 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:58:43 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 03:58:43 localhost podman[102698]: 2025-11-26 08:58:43.058460281 +0000 UTC m=+0.324268086 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z) Nov 26 03:58:43 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 03:58:43 localhost podman[102707]: 2025-11-26 08:58:43.255671605 +0000 UTC m=+0.506565401 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-nova-compute, version=17.1.12, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:58:43 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:58:43 localhost systemd[1]: tmp-crun.kkdk9F.mount: Deactivated successfully. Nov 26 03:58:57 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 03:58:57 localhost recover_tripleo_nova_virtqemud[102800]: 61604 Nov 26 03:58:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 03:58:57 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 03:59:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:59:00 localhost podman[102801]: 2025-11-26 08:59:00.829076629 +0000 UTC m=+0.086469796 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 03:59:01 localhost podman[102801]: 2025-11-26 08:59:01.026325355 +0000 UTC m=+0.283718542 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, container_name=metrics_qdr, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 26 03:59:01 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:59:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:59:05 localhost systemd[1]: tmp-crun.IH5ryX.mount: Deactivated successfully. 
Nov 26 03:59:05 localhost podman[102831]: 2025-11-26 08:59:05.80683395 +0000 UTC m=+0.072488817 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, release=1761123044, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, batch=17.1_20251118.1, architecture=x86_64, container_name=nova_compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 03:59:05 localhost podman[102831]: 2025-11-26 08:59:05.833150138 +0000 UTC m=+0.098804945 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat 
OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step5, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc.) Nov 26 03:59:05 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:59:10 localhost sshd[102858]: main: sshd: ssh-rsa algorithm is disabled Nov 26 03:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:59:11 localhost systemd[1]: tmp-crun.W6Kx4W.mount: Deactivated successfully. 
Nov 26 03:59:11 localhost podman[102860]: 2025-11-26 08:59:11.089001352 +0000 UTC m=+0.106639800 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:59:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:59:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:59:11 localhost podman[102860]: 2025-11-26 08:59:11.150172073 +0000 UTC m=+0.167810361 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Nov 26 03:59:11 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:59:11 localhost podman[102882]: 2025-11-26 08:59:11.226345386 +0000 UTC m=+0.109260563 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 03:59:11 localhost podman[102882]: 2025-11-26 08:59:11.238533518 +0000 UTC m=+0.121448665 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, 
io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, tcib_managed=true, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-cron-container, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:59:11 localhost systemd[1]: 
7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 03:59:11 localhost podman[102885]: 2025-11-26 08:59:11.291911274 +0000 UTC m=+0.170207796 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 26 03:59:11 localhost podman[102885]: 2025-11-26 08:59:11.322300799 +0000 UTC m=+0.200597361 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z) Nov 26 03:59:11 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:59:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:59:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:59:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 03:59:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:59:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 03:59:13 localhost podman[102932]: 2025-11-26 08:59:13.816141136 +0000 UTC m=+0.075649037 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4) Nov 26 03:59:13 localhost podman[102931]: 2025-11-26 08:59:13.889523321 +0000 UTC m=+0.145503811 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 03:59:13 localhost podman[102933]: 2025-11-26 08:59:13.845787987 +0000 UTC m=+0.096466730 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
health_status=unhealthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 03:59:13 localhost podman[102932]: 2025-11-26 08:59:13.900420793 +0000 
UTC m=+0.159928734 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, com.redhat.component=openstack-collectd-container, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 03:59:13 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:59:13 localhost podman[102944]: 2025-11-26 08:59:13.953666816 +0000 UTC m=+0.201854392 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 26 03:59:13 localhost podman[102945]: 2025-11-26 
08:59:13.974151259 +0000 UTC m=+0.217763321 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 03:59:14 localhost podman[102931]: 2025-11-26 08:59:14.026332817 +0000 UTC m=+0.282313307 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, release=1761123044, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git) Nov 26 03:59:14 localhost podman[102933]: 2025-11-26 08:59:14.02704065 +0000 UTC m=+0.277719403 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://www.redhat.com, container_name=ovn_controller, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z) Nov 26 03:59:14 localhost podman[102933]: unhealthy Nov 26 03:59:14 localhost podman[102944]: 2025-11-26 08:59:14.036318311 +0000 UTC m=+0.284505817 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, container_name=ovn_metadata_agent, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 03:59:14 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:59:14 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:59:14 localhost podman[102944]: unhealthy Nov 26 03:59:14 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:59:14 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:59:14 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:59:14 localhost podman[102945]: 2025-11-26 08:59:14.311322948 +0000 UTC m=+0.554935000 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target) Nov 26 03:59:14 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:59:14 localhost systemd[1]: tmp-crun.VA97Mb.mount: Deactivated successfully. Nov 26 03:59:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 03:59:31 localhost podman[103106]: 2025-11-26 08:59:31.822405873 +0000 UTC m=+0.083538555 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1) Nov 26 03:59:32 localhost podman[103106]: 2025-11-26 08:59:32.020711712 +0000 UTC m=+0.281844154 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, tcib_managed=true, 
build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public) Nov 26 03:59:32 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 03:59:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 03:59:36 localhost podman[103136]: 2025-11-26 08:59:36.807611977 +0000 UTC m=+0.069868385 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5) Nov 26 03:59:36 localhost podman[103136]: 2025-11-26 08:59:36.831852358 +0000 UTC m=+0.094108776 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step5, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute) Nov 26 03:59:36 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 03:59:41 localhost podman[103163]: 2025-11-26 08:59:41.880369112 +0000 UTC m=+0.140266507 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, distribution-scope=public, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:59:41 localhost podman[103162]: 2025-11-26 08:59:41.92807989 +0000 UTC m=+0.191288618 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, distribution-scope=public, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:59:41 localhost podman[103161]: 2025-11-26 08:59:41.844319429 +0000 UTC m=+0.108917901 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 03:59:41 localhost podman[103163]: 
2025-11-26 08:59:41.957487024 +0000 UTC m=+0.217384529 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 03:59:41 localhost podman[103162]: 2025-11-26 08:59:41.968980875 +0000 UTC m=+0.232189643 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron) Nov 26 03:59:41 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 03:59:41 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 03:59:42 localhost podman[103161]: 2025-11-26 08:59:42.026344786 +0000 UTC m=+0.290943208 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, distribution-scope=public, container_name=ceilometer_agent_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:59:42 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 03:59:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 03:59:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 03:59:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 03:59:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 03:59:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 03:59:44 localhost podman[103234]: 2025-11-26 08:59:44.86488934 +0000 UTC m=+0.127813656 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 03:59:44 localhost podman[103235]: 2025-11-26 08:59:44.815043604 +0000 UTC m=+0.075573374 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 03:59:44 localhost podman[103232]: 2025-11-26 08:59:44.927871798 +0000 UTC m=+0.191262488 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git) Nov 26 03:59:44 localhost podman[103233]: 2025-11-26 08:59:44.839004906 
+0000 UTC m=+0.100754284 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, release=1761123044, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team) Nov 26 03:59:44 localhost podman[103232]: 2025-11-26 08:59:44.940458693 +0000 UTC m=+0.203849333 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 03:59:44 localhost podman[103235]: 2025-11-26 08:59:44.951236952 +0000 UTC m=+0.211766702 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1) Nov 26 03:59:44 localhost podman[103235]: unhealthy Nov 26 03:59:44 localhost podman[103234]: 2025-11-26 08:59:44.957157318 +0000 UTC m=+0.220081654 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4) Nov 26 03:59:44 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:59:44 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 03:59:44 localhost podman[103234]: unhealthy Nov 26 03:59:44 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 03:59:44 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 03:59:45 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 03:59:45 localhost podman[103233]: 2025-11-26 08:59:45.02951222 +0000 UTC m=+0.291261638 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 26 03:59:45 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 03:59:45 localhost podman[103236]: 2025-11-26 08:59:45.073019857 +0000 UTC m=+0.330928285 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 26 03:59:45 localhost podman[103236]: 2025-11-26 08:59:45.422284626 +0000 UTC m=+0.680193124 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-type=git, 
name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 03:59:45 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 03:59:51 localhost sshd[103332]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:00:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 04:00:02 localhost podman[103338]: 2025-11-26 09:00:02.824260345 +0000 UTC m=+0.088727668 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 04:00:03 localhost podman[103338]: 2025-11-26 09:00:03.046541576 +0000 UTC m=+0.311008859 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 04:00:03 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 04:00:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 04:00:07 localhost podman[103367]: 2025-11-26 09:00:07.818917469 +0000 UTC m=+0.081520662 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_compute, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute) Nov 26 04:00:07 localhost podman[103367]: 2025-11-26 09:00:07.878277694 +0000 UTC m=+0.140880817 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, version=17.1.12, io.buildah.version=1.41.4) Nov 26 04:00:07 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Deactivated successfully. Nov 26 04:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 04:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 04:00:12 localhost systemd[1]: tmp-crun.AsDcq7.mount: Deactivated successfully. 
Nov 26 04:00:12 localhost podman[103398]: 2025-11-26 09:00:12.817872875 +0000 UTC m=+0.075780811 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z) Nov 26 04:00:12 localhost podman[103398]: 2025-11-26 09:00:12.84637105 +0000 UTC m=+0.104279006 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true) Nov 26 04:00:12 localhost podman[103397]: 2025-11-26 09:00:12.860506144 +0000 UTC m=+0.118977048 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-cron, managed_by=tripleo_ansible, version=17.1.12, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-cron-container) Nov 26 04:00:12 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 04:00:12 localhost podman[103397]: 2025-11-26 09:00:12.866167561 +0000 UTC m=+0.124638535 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com) Nov 26 04:00:12 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 04:00:12 localhost podman[103396]: 2025-11-26 09:00:12.923335948 +0000 UTC m=+0.184524767 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible) Nov 26 04:00:12 localhost podman[103396]: 2025-11-26 09:00:12.975506295 +0000 UTC m=+0.236695104 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, 
batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64) Nov 26 04:00:12 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 04:00:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 04:00:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 04:00:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:00:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:00:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 04:00:15 localhost systemd[1]: tmp-crun.cKNmal.mount: Deactivated successfully. 
Nov 26 04:00:15 localhost podman[103466]: 2025-11-26 09:00:15.816551725 +0000 UTC m=+0.083540844 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 26 04:00:15 localhost podman[103467]: 2025-11-26 09:00:15.839420534 +0000 UTC m=+0.098800534 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, batch=17.1_20251118.1, distribution-scope=public) Nov 26 04:00:15 localhost podman[103466]: 2025-11-26 09:00:15.904429826 +0000 UTC m=+0.171418905 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-iscsid-container, 
tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.expose-services=) Nov 26 04:00:15 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 04:00:15 localhost podman[103475]: 2025-11-26 09:00:15.87366632 +0000 UTC m=+0.126773883 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container) Nov 26 04:00:15 localhost podman[103467]: 2025-11-26 09:00:15.927325915 +0000 UTC m=+0.186705945 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 
17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, architecture=x86_64, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 26 04:00:15 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 04:00:15 localhost podman[103474]: 2025-11-26 09:00:15.977708157 +0000 UTC m=+0.235062284 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) 
Nov 26 04:00:16 localhost podman[103474]: 2025-11-26 09:00:16.023362901 +0000 UTC m=+0.280717058 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, io.openshift.expose-services=, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4) Nov 26 04:00:16 localhost podman[103474]: unhealthy Nov 26 04:00:16 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:00:16 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 04:00:16 localhost podman[103468]: 2025-11-26 09:00:16.032219369 +0000 UTC m=+0.292489727 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 04:00:16 localhost podman[103468]: 2025-11-26 09:00:16.113590315 +0000 UTC m=+0.373860743 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com) Nov 26 04:00:16 localhost podman[103468]: unhealthy Nov 26 04:00:16 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:00:16 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:00:16 localhost podman[103475]: 2025-11-26 09:00:16.254411128 +0000 UTC m=+0.507518741 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1) Nov 26 04:00:16 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 04:00:16 localhost systemd[1]: tmp-crun.8GUqhi.mount: Deactivated successfully. Nov 26 04:00:25 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 04:00:25 localhost recover_tripleo_nova_virtqemud[103580]: 61604 Nov 26 04:00:25 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 04:00:25 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 26 04:00:26 localhost systemd[1]: tmp-crun.04AeGv.mount: Deactivated successfully. Nov 26 04:00:26 localhost podman[103669]: 2025-11-26 09:00:26.317815647 +0000 UTC m=+0.127431194 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True) Nov 26 04:00:26 localhost podman[103669]: 2025-11-26 09:00:26.401044541 +0000 UTC m=+0.210660168 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_BRANCH=main, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , RELEASE=main, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55) Nov 26 04:00:33 localhost sshd[103810]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:00:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 04:00:33 localhost systemd[1]: tmp-crun.rJU344.mount: Deactivated successfully. 
Nov 26 04:00:33 localhost podman[103811]: 2025-11-26 09:00:33.855551082 +0000 UTC m=+0.111974298 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.12) Nov 26 04:00:34 localhost podman[103811]: 2025-11-26 09:00:34.04333757 +0000 UTC m=+0.299760786 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, version=17.1.12) Nov 26 04:00:34 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 04:00:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 04:00:38 localhost systemd[1]: tmp-crun.iY9WsY.mount: Deactivated successfully. 
Nov 26 04:00:38 localhost podman[103842]: 2025-11-26 09:00:38.810092983 +0000 UTC m=+0.070758563 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, architecture=x86_64, release=1761123044) Nov 26 04:00:38 localhost podman[103842]: 2025-11-26 09:00:38.829324287 +0000 UTC m=+0.089989877 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Nov 26 04:00:38 localhost podman[103842]: unhealthy Nov 26 04:00:38 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:00:38 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 04:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 04:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 04:00:43 localhost podman[103865]: 2025-11-26 09:00:43.82139056 +0000 UTC m=+0.079001913 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 04:00:43 localhost podman[103865]: 2025-11-26 09:00:43.828867285 +0000 UTC m=+0.086478668 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible) Nov 26 04:00:43 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 04:00:43 localhost podman[103866]: 2025-11-26 09:00:43.895593201 +0000 UTC m=+0.150618352 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:00:43 localhost podman[103866]: 2025-11-26 09:00:43.926758049 +0000 UTC m=+0.181783210 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z) Nov 26 04:00:43 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 04:00:43 localhost podman[103864]: 2025-11-26 09:00:43.927671388 +0000 UTC m=+0.187789459 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team) Nov 26 04:00:44 localhost podman[103864]: 2025-11-26 09:00:44.006605987 +0000 UTC m=+0.266724118 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com) Nov 26 04:00:44 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. Nov 26 04:00:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 04:00:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 04:00:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:00:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:00:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 04:00:46 localhost systemd[1]: tmp-crun.apv1Fc.mount: Deactivated successfully. Nov 26 04:00:46 localhost podman[103937]: 2025-11-26 09:00:46.839819732 +0000 UTC m=+0.088420109 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 04:00:46 localhost systemd[1]: tmp-crun.p7v4es.mount: Deactivated successfully. Nov 26 04:00:46 localhost podman[103943]: 2025-11-26 09:00:46.876003587 +0000 UTC m=+0.119814273 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4) Nov 26 04:00:46 localhost podman[103937]: 2025-11-26 09:00:46.881334465 +0000 UTC m=+0.129934842 container exec_died 
4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, 
io.buildah.version=1.41.4, batch=17.1_20251118.1) Nov 26 04:00:46 localhost podman[103943]: 2025-11-26 09:00:46.884288988 +0000 UTC m=+0.128099684 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 26 04:00:46 localhost podman[103937]: unhealthy Nov 26 04:00:46 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:00:46 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 04:00:46 localhost podman[103954]: 2025-11-26 09:00:46.852771728 +0000 UTC m=+0.087744786 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, container_name=nova_migration_target, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 04:00:46 localhost podman[103943]: unhealthy Nov 26 04:00:46 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:00:46 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 04:00:46 localhost podman[103936]: 2025-11-26 09:00:46.982084429 +0000 UTC m=+0.229333853 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 04:00:47 localhost podman[103936]: 2025-11-26 09:00:47.024949006 +0000 UTC m=+0.272198460 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 26 04:00:47 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 04:00:47 localhost podman[103935]: 2025-11-26 09:00:47.044313475 +0000 UTC m=+0.295284716 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, architecture=x86_64, vcs-type=git, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044) Nov 26 04:00:47 localhost podman[103935]: 2025-11-26 09:00:47.055454885 +0000 UTC m=+0.306426206 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044) Nov 26 04:00:47 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 04:00:47 localhost podman[103954]: 2025-11-26 09:00:47.233862687 +0000 UTC m=+0.468835805 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:00:47 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 04:01:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 04:01:04 localhost podman[104061]: 2025-11-26 09:01:04.816027576 +0000 UTC m=+0.079465816 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, release=1761123044, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 04:01:05 localhost podman[104061]: 2025-11-26 09:01:05.022715358 +0000 UTC m=+0.286153568 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=) Nov 26 04:01:05 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 04:01:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 04:01:09 localhost podman[104090]: 2025-11-26 09:01:09.823878805 +0000 UTC m=+0.084790834 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:01:09 localhost podman[104090]: 2025-11-26 09:01:09.864182331 +0000 UTC m=+0.125094380 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, 
build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, container_name=nova_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=) Nov 26 04:01:09 localhost podman[104090]: unhealthy Nov 26 04:01:09 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:01:09 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 04:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 04:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 04:01:14 localhost podman[104113]: 2025-11-26 09:01:14.830996878 +0000 UTC m=+0.082175861 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, container_name=ceilometer_agent_compute) Nov 26 04:01:14 localhost podman[104113]: 2025-11-26 09:01:14.865460381 +0000 UTC m=+0.116639434 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, container_name=ceilometer_agent_compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:11:48Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute) Nov 26 04:01:14 localhost podman[104114]: 2025-11-26 09:01:14.893923685 +0000 UTC m=+0.140883375 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Nov 26 04:01:14 localhost podman[104114]: 2025-11-26 09:01:14.924734023 +0000 UTC m=+0.171693673 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z) Nov 26 04:01:14 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 04:01:14 localhost podman[104115]: 2025-11-26 09:01:14.942302874 +0000 UTC m=+0.187002664 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4) Nov 26 04:01:14 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 04:01:15 localhost podman[104115]: 2025-11-26 09:01:15.023844605 +0000 UTC m=+0.268544425 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:01:15 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 04:01:16 localhost sshd[104185]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:01:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 04:01:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 04:01:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:01:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:01:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 04:01:17 localhost systemd[1]: tmp-crun.iJyx5A.mount: Deactivated successfully. 
Nov 26 04:01:17 localhost podman[104189]: 2025-11-26 09:01:17.841670018 +0000 UTC m=+0.095218352 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 04:01:17 localhost podman[104189]: 2025-11-26 09:01:17.885257097 +0000 UTC m=+0.138805351 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible) Nov 26 04:01:17 localhost podman[104189]: unhealthy Nov 26 04:01:17 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:01:17 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:01:17 localhost podman[104190]: 2025-11-26 09:01:17.879207487 +0000 UTC m=+0.128634961 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 04:01:17 localhost podman[104188]: 2025-11-26 09:01:17.935457194 +0000 UTC m=+0.190112592 container health_status 
1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 26 04:01:17 localhost podman[104196]: 2025-11-26 09:01:17.90827603 +0000 UTC m=+0.153437841 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:01:17 localhost podman[104188]: 2025-11-26 09:01:17.971179015 +0000 UTC m=+0.225834373 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, container_name=collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12) Nov 26 04:01:17 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 04:01:17 localhost podman[104187]: 2025-11-26 09:01:17.987111166 +0000 UTC m=+0.244314915 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:01:17 localhost podman[104187]: 2025-11-26 09:01:17.999134613 +0000 UTC m=+0.256338282 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, release=1761123044, com.redhat.component=openstack-iscsid-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Nov 26 04:01:18 localhost podman[104190]: 2025-11-26 09:01:18.012021018 +0000 UTC m=+0.261448532 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, build-date=2025-11-19T00:14:25Z, distribution-scope=public, container_name=ovn_metadata_agent, 
release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 26 04:01:18 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 04:01:18 localhost podman[104190]: unhealthy Nov 26 04:01:18 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:01:18 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 04:01:18 localhost podman[104196]: 2025-11-26 09:01:18.281326226 +0000 UTC m=+0.526488007 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Nov 26 04:01:18 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 04:01:28 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 04:01:28 localhost recover_tripleo_nova_virtqemud[104317]: 61604 Nov 26 04:01:28 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 04:01:28 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 04:01:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 04:01:35 localhost podman[104365]: 2025-11-26 09:01:35.797394397 +0000 UTC m=+0.065801612 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, architecture=x86_64) Nov 26 04:01:35 localhost podman[104365]: 2025-11-26 09:01:35.993607629 +0000 UTC m=+0.262014844 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Nov 26 04:01:36 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 04:01:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 04:01:40 localhost podman[104394]: 2025-11-26 09:01:40.822738288 +0000 UTC m=+0.081835394 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:01:40 localhost podman[104394]: 2025-11-26 09:01:40.844680604 +0000 UTC m=+0.103777690 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, 
version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:01:40 localhost podman[104394]: unhealthy Nov 26 04:01:40 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:01:40 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 04:01:44 localhost sshd[104417]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:01:44 localhost systemd-logind[761]: New session 37 of user zuul. Nov 26 04:01:44 localhost systemd[1]: Started Session 37 of User zuul. Nov 26 04:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 04:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 04:01:45 localhost podman[104513]: 2025-11-26 09:01:45.286994396 +0000 UTC m=+0.091648880 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 26 04:01:45 localhost podman[104513]: 2025-11-26 09:01:45.338557761 +0000 UTC m=+0.143212215 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 26 04:01:45 localhost systemd[1]: tmp-crun.1ZxAB5.mount: Deactivated successfully. Nov 26 04:01:45 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 04:01:45 localhost podman[104515]: 2025-11-26 09:01:45.348277594 +0000 UTC m=+0.143147312 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12) Nov 26 04:01:45 localhost podman[104514]: 2025-11-26 09:01:45.394001595 +0000 UTC m=+0.191963679 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond) Nov 26 04:01:45 localhost podman[104515]: 2025-11-26 09:01:45.401441029 +0000 UTC m=+0.196310857 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 04:01:45 localhost python3.9[104512]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True 
get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:01:45 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 04:01:45 localhost podman[104514]: 2025-11-26 09:01:45.428515917 +0000 UTC m=+0.226478031 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12) Nov 26 04:01:45 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 04:01:46 localhost python3.9[104676]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:01:46 localhost python3.9[104769]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:01:47 localhost python3.9[104863]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None 
chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:01:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 04:01:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 04:01:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:01:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:01:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 04:01:48 localhost podman[104960]: 2025-11-26 09:01:48.39950286 +0000 UTC m=+0.096454910 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 04:01:48 localhost systemd[1]: tmp-crun.wgF6QI.mount: Deactivated successfully. 
Nov 26 04:01:48 localhost systemd[1]: tmp-crun.zkYshW.mount: Deactivated successfully. Nov 26 04:01:48 localhost podman[104958]: 2025-11-26 09:01:48.430837011 +0000 UTC m=+0.138561979 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:01:48 localhost podman[104958]: 2025-11-26 09:01:48.43881108 +0000 UTC m=+0.146536038 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, 
build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, container_name=collectd, managed_by=tripleo_ansible) Nov 26 04:01:48 localhost podman[104959]: 2025-11-26 09:01:48.447321257 +0000 UTC m=+0.149973886 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, release=1761123044, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 26 04:01:48 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. Nov 26 04:01:48 localhost podman[104957]: 2025-11-26 09:01:48.487314549 +0000 UTC m=+0.194057136 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T23:44:13Z) Nov 26 04:01:48 localhost python3.9[104956]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:01:48 localhost podman[104960]: 2025-11-26 09:01:48.517543175 +0000 UTC m=+0.214495195 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, 
distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Nov 26 04:01:48 localhost podman[104960]: unhealthy Nov 26 04:01:48 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:01:48 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 04:01:48 localhost podman[104959]: 2025-11-26 09:01:48.540167283 +0000 UTC m=+0.242819962 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, container_name=ovn_controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:01:48 localhost podman[104959]: unhealthy Nov 26 04:01:48 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:01:48 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 04:01:48 localhost podman[105010]: 2025-11-26 09:01:48.521041935 +0000 UTC m=+0.115305711 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 04:01:48 localhost podman[104957]: 2025-11-26 09:01:48.574381755 +0000 UTC m=+0.281124372 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4) Nov 26 04:01:48 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 04:01:48 localhost podman[105010]: 2025-11-26 09:01:48.885673079 +0000 UTC m=+0.479936905 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 04:01:48 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 04:01:49 localhost python3.9[105148]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Nov 26 04:01:50 localhost python3.9[105238]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:01:51 localhost python3.9[105330]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Nov 26 04:01:52 localhost python3.9[105420]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:01:53 localhost python3.9[105468]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None 
releasever=None Nov 26 04:01:53 localhost systemd[1]: session-37.scope: Deactivated successfully. Nov 26 04:01:53 localhost systemd[1]: session-37.scope: Consumed 4.847s CPU time. Nov 26 04:01:53 localhost systemd-logind[761]: Session 37 logged out. Waiting for processes to exit. Nov 26 04:01:53 localhost systemd-logind[761]: Removed session 37. Nov 26 04:01:56 localhost sshd[105484]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:02:02 localhost sshd[105486]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:02:02 localhost systemd-logind[761]: New session 38 of user zuul. Nov 26 04:02:02 localhost systemd[1]: Started Session 38 of User zuul. Nov 26 04:02:03 localhost python3.9[105581]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:02:03 localhost systemd[1]: Reloading. Nov 26 04:02:03 localhost systemd-sysv-generator[105612]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:02:03 localhost systemd-rc-local-generator[105606]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:02:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:02:04 localhost python3.9[105707]: ansible-ansible.builtin.service_facts Invoked Nov 26 04:02:04 localhost network[105724]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 04:02:04 localhost network[105725]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:02:04 localhost network[105726]: It is advised to switch to 'NetworkManager' instead for network management. 
Nov 26 04:02:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 04:02:06 localhost systemd[1]: tmp-crun.l2FXL7.mount: Deactivated successfully. Nov 26 04:02:06 localhost podman[105734]: 2025-11-26 09:02:06.62823473 +0000 UTC m=+0.088097188 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 04:02:06 localhost podman[105734]: 2025-11-26 09:02:06.820823639 +0000 UTC m=+0.280686117 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, 
build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4) Nov 26 04:02:06 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 04:02:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:02:10 localhost python3.9[105953]: ansible-ansible.builtin.service_facts Invoked Nov 26 04:02:10 localhost network[105970]: You are using 'network' service provided by 'network-scripts', which are now deprecated. 
Nov 26 04:02:10 localhost network[105971]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:02:10 localhost network[105972]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 04:02:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 04:02:10 localhost podman[105996]: 2025-11-26 09:02:10.98138941 +0000 UTC m=+0.085539208 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step5, release=1761123044, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 04:02:11 localhost podman[105996]: 2025-11-26 09:02:11.009924693 +0000 UTC m=+0.114074521 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, release=1761123044, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:02:11 localhost podman[105996]: unhealthy Nov 26 04:02:11 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:02:11 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 04:02:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:02:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 04:02:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:02:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. 
Nov 26 04:02:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38013 DF PROTO=TCP SPT=43976 DPT=9105 SEQ=503007414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3456F0000000001030307) Nov 26 04:02:15 localhost systemd[1]: tmp-crun.ODaMp6.mount: Deactivated successfully. Nov 26 04:02:15 localhost podman[106119]: 2025-11-26 09:02:15.841440815 +0000 UTC m=+0.095873372 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 26 04:02:15 localhost podman[106117]: 2025-11-26 09:02:15.886257068 +0000 UTC m=+0.145437904 container health_status 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 04:02:15 localhost podman[106119]: 2025-11-26 09:02:15.924845746 +0000 UTC m=+0.179278313 container exec_died 
90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044) Nov 26 04:02:15 localhost podman[106117]: 2025-11-26 09:02:15.940365071 +0000 UTC m=+0.199545927 container exec_died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64) Nov 26 04:02:15 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. Nov 26 04:02:15 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Deactivated successfully. 
Nov 26 04:02:15 localhost podman[106118]: 2025-11-26 09:02:15.944182382 +0000 UTC m=+0.195207953 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:02:16 localhost podman[106118]: 2025-11-26 09:02:16.024056311 +0000 UTC m=+0.275081852 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container) Nov 26 04:02:16 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 04:02:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38014 DF PROTO=TCP SPT=43976 DPT=9105 SEQ=503007414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3497C0000000001030307) Nov 26 04:02:18 localhost python3.9[106264]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:02:18 localhost systemd[1]: Reloading. Nov 26 04:02:18 localhost systemd-sysv-generator[106293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:02:18 localhost systemd-rc-local-generator[106289]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:02:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:02:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 04:02:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 04:02:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:02:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:02:18 localhost systemd[1]: Stopping ceilometer_agent_compute container... Nov 26 04:02:18 localhost systemd[1]: tmp-crun.1NUKB1.mount: Deactivated successfully. 
Nov 26 04:02:18 localhost podman[106304]: 2025-11-26 09:02:18.751342977 +0000 UTC m=+0.092746644 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3) Nov 26 04:02:18 localhost podman[106304]: 2025-11-26 09:02:18.763062974 +0000 UTC m=+0.104466711 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, config_id=tripleo_step3, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=) Nov 26 04:02:18 localhost podman[106305]: 2025-11-26 09:02:18.802958102 +0000 UTC m=+0.144217755 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, batch=17.1_20251118.1, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 26 04:02:18 localhost 
systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 04:02:18 localhost podman[106305]: 2025-11-26 09:02:18.845447662 +0000 UTC m=+0.186707355 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd) Nov 26 04:02:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38015 DF PROTO=TCP SPT=43976 DPT=9105 SEQ=503007414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3517C0000000001030307) Nov 26 04:02:18 localhost podman[106306]: 2025-11-26 09:02:18.860191975 +0000 UTC m=+0.200815568 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:02:18 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 04:02:18 localhost podman[106306]: 2025-11-26 09:02:18.902067986 +0000 UTC m=+0.242691579 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:02:18 localhost podman[106306]: unhealthy Nov 26 04:02:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:02:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:02:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 04:02:18 localhost podman[106307]: 2025-11-26 09:02:18.911762439 +0000 UTC m=+0.251685700 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 04:02:19 localhost podman[106394]: 2025-11-26 09:02:19.02426468 +0000 UTC m=+0.081844243 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:02:19 localhost podman[106307]: 2025-11-26 09:02:19.049751038 +0000 UTC m=+0.389674299 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ovn_metadata_agent, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12) Nov 26 04:02:19 localhost podman[106307]: unhealthy Nov 26 04:02:19 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:02:19 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 04:02:19 localhost podman[106394]: 2025-11-26 09:02:19.3890865 +0000 UTC m=+0.446666023 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=nova_migration_target, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:02:19 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 04:02:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38016 DF PROTO=TCP SPT=43976 DPT=9105 SEQ=503007414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3613D0000000001030307) Nov 26 04:02:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1732 DF PROTO=TCP SPT=36266 DPT=9102 SEQ=2462347211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A364B80000000001030307) Nov 26 04:02:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9463 DF PROTO=TCP SPT=38108 DPT=9100 SEQ=2344212765 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A365ED0000000001030307) Nov 26 04:02:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1733 DF 
PROTO=TCP SPT=36266 DPT=9102 SEQ=2462347211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A368BC0000000001030307) Nov 26 04:02:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9464 DF PROTO=TCP SPT=38108 DPT=9100 SEQ=2344212765 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A369FC0000000001030307) Nov 26 04:02:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1734 DF PROTO=TCP SPT=36266 DPT=9102 SEQ=2462347211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A370BD0000000001030307) Nov 26 04:02:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27126 DF PROTO=TCP SPT=39322 DPT=9101 SEQ=3983085281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3716B0000000001030307) Nov 26 04:02:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9465 DF PROTO=TCP SPT=38108 DPT=9100 SEQ=2344212765 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A371FC0000000001030307) Nov 26 04:02:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19249 DF PROTO=TCP SPT=47370 DPT=9882 SEQ=2960516245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A373140000000001030307) Nov 26 04:02:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=27127 DF PROTO=TCP SPT=39322 DPT=9101 SEQ=3983085281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3757C0000000001030307) Nov 26 04:02:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19250 DF PROTO=TCP SPT=47370 DPT=9882 SEQ=2960516245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3773C0000000001030307) Nov 26 04:02:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27128 DF PROTO=TCP SPT=39322 DPT=9101 SEQ=3983085281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A37D7C0000000001030307) Nov 26 04:02:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19251 DF PROTO=TCP SPT=47370 DPT=9882 SEQ=2960516245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A37F3D0000000001030307) Nov 26 04:02:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1735 DF PROTO=TCP SPT=36266 DPT=9102 SEQ=2462347211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3807D0000000001030307) Nov 26 04:02:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9466 DF PROTO=TCP SPT=38108 DPT=9100 SEQ=2344212765 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A381BC0000000001030307) Nov 26 04:02:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=38017 DF PROTO=TCP SPT=43976 DPT=9105 SEQ=503007414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A381FC0000000001030307) Nov 26 04:02:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27129 DF PROTO=TCP SPT=39322 DPT=9101 SEQ=3983085281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A38D3C0000000001030307) Nov 26 04:02:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19252 DF PROTO=TCP SPT=47370 DPT=9882 SEQ=2960516245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A38EFC0000000001030307) Nov 26 04:02:36 localhost sshd[106499]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:02:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. 
Nov 26 04:02:36 localhost podman[106501]: 2025-11-26 09:02:36.976450103 +0000 UTC m=+0.083306939 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Nov 26 04:02:37 localhost podman[106501]: 2025-11-26 09:02:37.158294815 +0000 UTC m=+0.265151561 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, tcib_managed=true, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 04:02:37 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 04:02:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1736 DF PROTO=TCP SPT=36266 DPT=9102 SEQ=2462347211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A39FFC0000000001030307) Nov 26 04:02:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 04:02:41 localhost systemd[1]: tmp-crun.uNqh8h.mount: Deactivated successfully. 
Nov 26 04:02:41 localhost podman[106530]: 2025-11-26 09:02:41.580922849 +0000 UTC m=+0.093384254 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z) Nov 26 04:02:41 localhost podman[106530]: 2025-11-26 09:02:41.627881879 +0000 UTC m=+0.140343234 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:02:41 localhost podman[106530]: unhealthy Nov 26 04:02:41 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:02:41 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. Nov 26 04:02:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27130 DF PROTO=TCP SPT=39322 DPT=9101 SEQ=3983085281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3ADFC0000000001030307) Nov 26 04:02:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19253 DF PROTO=TCP SPT=47370 DPT=9882 SEQ=2960516245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3AFFC0000000001030307) Nov 26 04:02:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 04:02:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. 
Nov 26 04:02:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 04:02:46 localhost podman[106553]: 2025-11-26 09:02:46.318549184 +0000 UTC m=+0.083771694 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, architecture=x86_64, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron) Nov 26 04:02:46 localhost podman[106552]: Error: container 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe is not running Nov 26 04:02:46 localhost podman[106553]: 2025-11-26 09:02:46.353521549 +0000 UTC m=+0.118744029 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, 
release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, architecture=x86_64, version=17.1.12) Nov 26 04:02:46 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Main process exited, code=exited, status=125/n/a Nov 26 04:02:46 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Failed with result 'exit-code'. Nov 26 04:02:46 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 04:02:46 localhost podman[106554]: 2025-11-26 09:02:46.428710813 +0000 UTC m=+0.189663658 container health_status 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 26 04:02:46 localhost podman[106554]: 2025-11-26 09:02:46.486306236 +0000 UTC m=+0.247259081 container exec_died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:02:46 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Deactivated successfully. 
Nov 26 04:02:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60196 DF PROTO=TCP SPT=43180 DPT=9105 SEQ=259178710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3BEBC0000000001030307) Nov 26 04:02:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60197 DF PROTO=TCP SPT=43180 DPT=9105 SEQ=259178710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3C6BD0000000001030307) Nov 26 04:02:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 04:02:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 04:02:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:02:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:02:49 localhost systemd[1]: tmp-crun.OoZqbd.mount: Deactivated successfully. 
Nov 26 04:02:49 localhost podman[106615]: 2025-11-26 09:02:49.149839435 +0000 UTC m=+0.160349051 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-type=git, container_name=collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044, distribution-scope=public, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Nov 26 04:02:49 localhost podman[106614]: 2025-11-26 09:02:49.185959705 +0000 UTC m=+0.196105799 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-type=git, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, io.openshift.expose-services=) Nov 26 04:02:49 localhost podman[106614]: 2025-11-26 09:02:49.199357735 +0000 UTC m=+0.209503819 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 26 04:02:49 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 04:02:49 localhost podman[106615]: 2025-11-26 09:02:49.24552628 +0000 UTC m=+0.256035946 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, container_name=collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 04:02:49 localhost podman[106616]: 2025-11-26 09:02:49.101169011 +0000 UTC m=+0.106693991 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 26 04:02:49 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 04:02:49 localhost podman[106616]: 2025-11-26 09:02:49.286486392 +0000 UTC m=+0.292011372 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:02:49 localhost podman[106616]: unhealthy Nov 26 04:02:49 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:02:49 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:02:49 localhost podman[106652]: 2025-11-26 09:02:49.303179485 +0000 UTC m=+0.198607818 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, version=17.1.12, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 26 04:02:49 localhost podman[106652]: 2025-11-26 09:02:49.347419089 +0000 UTC m=+0.242847352 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 04:02:49 localhost podman[106652]: unhealthy Nov 26 04:02:49 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:02:49 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 04:02:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 04:02:49 localhost podman[106693]: 2025-11-26 09:02:49.828667594 +0000 UTC m=+0.084632181 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:02:50 localhost podman[106693]: 2025-11-26 09:02:50.16715733 +0000 UTC m=+0.423121867 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, url=https://www.redhat.com, version=17.1.12) Nov 26 04:02:50 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 04:02:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60198 DF PROTO=TCP SPT=43180 DPT=9105 SEQ=259178710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3D67C0000000001030307) Nov 26 04:02:53 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 26 04:02:53 localhost recover_tripleo_nova_virtqemud[106717]: 61604 Nov 26 04:02:53 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 04:02:53 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 04:02:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2392 DF PROTO=TCP SPT=37520 DPT=9102 SEQ=1026897194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3DDFC0000000001030307) Nov 26 04:02:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24339 DF PROTO=TCP SPT=33036 DPT=9101 SEQ=839095054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3EABC0000000001030307) Nov 26 04:03:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2394 DF PROTO=TCP SPT=37520 DPT=9102 SEQ=1026897194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A3F5BD0000000001030307) Nov 26 04:03:00 localhost podman[106309]: time="2025-11-26T09:03:00Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL" Nov 26 04:03:00 localhost systemd[1]: tmp-crun.9tprkS.mount: Deactivated successfully. Nov 26 04:03:00 localhost systemd[1]: libpod-3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.scope: Deactivated successfully. Nov 26 04:03:00 localhost systemd[1]: libpod-3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.scope: Consumed 6.500s CPU time. 
Nov 26 04:03:00 localhost podman[106309]: 2025-11-26 09:03:00.980159652 +0000 UTC m=+42.314168508 container died 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Nov 26 04:03:00 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.timer: Deactivated successfully. Nov 26 04:03:00 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe. Nov 26 04:03:00 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Failed to open /run/systemd/transient/3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: No such file or directory Nov 26 04:03:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:03:01 localhost podman[106309]: 2025-11-26 09:03:01.036188186 +0000 UTC m=+42.370196982 container cleanup 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible) Nov 26 04:03:01 localhost podman[106309]: ceilometer_agent_compute Nov 26 04:03:01 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.timer: Failed to open /run/systemd/transient/3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.timer: No such file or directory Nov 26 04:03:01 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Failed to open /run/systemd/transient/3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: No such file or directory Nov 26 04:03:01 localhost podman[106720]: 2025-11-26 09:03:01.074074681 +0000 UTC m=+0.081888694 container cleanup 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:03:01 localhost systemd[1]: libpod-conmon-3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.scope: Deactivated successfully. Nov 26 04:03:01 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.timer: Failed to open /run/systemd/transient/3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.timer: No such file or directory Nov 26 04:03:01 localhost systemd[1]: 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: Failed to open /run/systemd/transient/3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe.service: No such file or directory Nov 26 04:03:01 localhost podman[106735]: 2025-11-26 09:03:01.172186793 +0000 UTC m=+0.061846387 container cleanup 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, release=1761123044, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Nov 26 04:03:01 localhost podman[106735]: ceilometer_agent_compute Nov 26 04:03:01 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully. Nov 26 04:03:01 localhost systemd[1]: Stopped ceilometer_agent_compute container. Nov 26 04:03:01 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.121s CPU time, no IO. 
Nov 26 04:03:01 localhost python3.9[106839]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:03:01 localhost systemd[1]: var-lib-containers-storage-overlay-0518d1e1e8bf83217956125556b84af78ba63cfd3e598ff1993dc680b963ed72-merged.mount: Deactivated successfully. Nov 26 04:03:01 localhost systemd[1]: Reloading. Nov 26 04:03:02 localhost systemd-rc-local-generator[106865]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:03:02 localhost systemd-sysv-generator[106870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:03:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:03:02 localhost systemd[1]: Stopping ceilometer_agent_ipmi container... Nov 26 04:03:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24341 DF PROTO=TCP SPT=33036 DPT=9101 SEQ=839095054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4027C0000000001030307) Nov 26 04:03:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 04:03:07 localhost systemd[1]: tmp-crun.JCATMK.mount: Deactivated successfully. 
Nov 26 04:03:07 localhost podman[106893]: 2025-11-26 09:03:07.560443772 +0000 UTC m=+0.076207827 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:03:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:ef:84:b8 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=36192 SEQ=988966100 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Nov 26 04:03:07 localhost podman[106893]: 2025-11-26 09:03:07.752619057 +0000 UTC m=+0.268383142 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git) Nov 26 04:03:07 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. Nov 26 04:03:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. 
Nov 26 04:03:11 localhost podman[106922]: 2025-11-26 09:03:11.825006608 +0000 UTC m=+0.081109730 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Nov 26 04:03:11 localhost podman[106922]: 2025-11-26 09:03:11.870471011 +0000 UTC m=+0.126574073 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vendor=Red Hat, Inc., version=17.1.12) Nov 26 04:03:11 localhost podman[106922]: unhealthy Nov 26 04:03:11 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:03:11 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. 
Nov 26 04:03:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24342 DF PROTO=TCP SPT=33036 DPT=9101 SEQ=839095054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A421FC0000000001030307) Nov 26 04:03:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:ef:84:b8 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=36192 SEQ=988966100 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Nov 26 04:03:15 localhost sshd[106944]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:03:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10609 DF PROTO=TCP SPT=38030 DPT=9105 SEQ=2179211065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A42FD10000000001030307) Nov 26 04:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:03:16 localhost systemd[1]: tmp-crun.3XOAfx.mount: Deactivated successfully. 
Nov 26 04:03:16 localhost podman[106946]: 2025-11-26 09:03:16.559590891 +0000 UTC m=+0.071513399 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron) Nov 26 04:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 04:03:16 localhost podman[106946]: 2025-11-26 09:03:16.600333267 +0000 UTC m=+0.112255815 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 04:03:16 localhost podman[106964]: Error: container 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 is not running Nov 26 04:03:16 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Main process exited, code=exited, status=125/n/a Nov 26 04:03:16 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Failed with result 'exit-code'. Nov 26 04:03:16 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. 
Nov 26 04:03:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10611 DF PROTO=TCP SPT=38030 DPT=9105 SEQ=2179211065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A43BBC0000000001030307) Nov 26 04:03:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 04:03:19 localhost podman[106978]: 2025-11-26 09:03:19.319761556 +0000 UTC m=+0.083105743 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, config_id=tripleo_step3, architecture=x86_64) Nov 26 04:03:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 04:03:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. 
Nov 26 04:03:19 localhost podman[106978]: 2025-11-26 09:03:19.356341481 +0000 UTC m=+0.119685658 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., 
summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container) Nov 26 04:03:19 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 04:03:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:03:19 localhost podman[106997]: 2025-11-26 09:03:19.462512364 +0000 UTC m=+0.126610165 container health_status 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, vcs-type=git, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 04:03:19 localhost podman[107020]: 2025-11-26 09:03:19.508577786 +0000 UTC m=+0.119688918 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
health_status=unhealthy, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 04:03:19 localhost podman[106997]: 2025-11-26 09:03:19.524106981 +0000 UTC m=+0.188204802 container exec_died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step3, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4) Nov 26 04:03:19 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Deactivated successfully. 
Nov 26 04:03:19 localhost podman[107020]: 2025-11-26 09:03:19.552351477 +0000 UTC m=+0.163462589 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 04:03:19 localhost podman[107020]: unhealthy Nov 26 04:03:19 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:03:19 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 04:03:19 localhost podman[106998]: 2025-11-26 09:03:19.429191821 +0000 UTC m=+0.087466110 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, 
io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Nov 26 04:03:19 localhost podman[106998]: 2025-11-26 09:03:19.612292493 +0000 UTC m=+0.270566752 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
name=rhosp17/openstack-ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Nov 26 04:03:19 localhost podman[106998]: unhealthy Nov 26 04:03:19 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:03:19 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:03:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 04:03:20 localhost systemd[1]: tmp-crun.dYugn3.mount: Deactivated successfully. Nov 26 04:03:20 localhost podman[107054]: 2025-11-26 09:03:20.314132354 +0000 UTC m=+0.080121160 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 
17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Nov 26 04:03:20 localhost podman[107054]: 2025-11-26 09:03:20.74145185 +0000 UTC m=+0.507440616 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Nov 26 04:03:20 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 04:03:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10612 DF PROTO=TCP SPT=38030 DPT=9105 SEQ=2179211065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A44B7C0000000001030307) Nov 26 04:03:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57927 DF PROTO=TCP SPT=51160 DPT=9102 SEQ=2856988641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4533C0000000001030307) Nov 26 04:03:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:ef:84:b8 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=36192 SEQ=988966100 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Nov 26 04:03:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57929 DF PROTO=TCP SPT=51160 DPT=9102 SEQ=2856988641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A46AFC0000000001030307) Nov 26 04:03:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45288 DF PROTO=TCP SPT=50324 DPT=9101 SEQ=3117661568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 
OPT (020405500402080A52A4777C0000000001030307) Nov 26 04:03:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 04:03:38 localhost systemd[1]: tmp-crun.XFsjQj.mount: Deactivated successfully. Nov 26 04:03:38 localhost podman[107155]: 2025-11-26 09:03:38.318715146 +0000 UTC m=+0.083744283 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, name=rhosp17/openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 26 04:03:38 localhost podman[107155]: 2025-11-26 09:03:38.513329147 +0000 UTC m=+0.278358304 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr, tcib_managed=true, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd) Nov 26 04:03:38 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 04:03:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49318 DF PROTO=TCP SPT=37070 DPT=9100 SEQ=1762415465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A48BFC0000000001030307) Nov 26 04:03:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 04:03:42 localhost podman[107186]: 2025-11-26 09:03:42.068247409 +0000 UTC m=+0.084562898 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step5, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:03:42 localhost podman[107186]: 2025-11-26 09:03:42.095453391 +0000 UTC m=+0.111768840 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_compute) Nov 26 04:03:42 localhost podman[107186]: unhealthy Nov 26 04:03:42 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:03:42 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. 
Nov 26 04:03:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45289 DF PROTO=TCP SPT=50324 DPT=9101 SEQ=3117661568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A497FC0000000001030307) Nov 26 04:03:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55400 DF PROTO=TCP SPT=54356 DPT=9882 SEQ=3583715256 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A499FC0000000001030307) Nov 26 04:03:44 localhost podman[106879]: time="2025-11-26T09:03:44Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Nov 26 04:03:44 localhost systemd[1]: tmp-crun.AWIUwi.mount: Deactivated successfully. Nov 26 04:03:44 localhost systemd[1]: libpod-90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.scope: Deactivated successfully. Nov 26 04:03:44 localhost systemd[1]: libpod-90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.scope: Consumed 6.459s CPU time. 
Nov 26 04:03:44 localhost podman[106879]: 2025-11-26 09:03:44.377037123 +0000 UTC m=+42.090814098 container stop 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z) Nov 26 04:03:44 localhost podman[106879]: 2025-11-26 09:03:44.409125208 +0000 UTC m=+42.122902183 container died 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git) Nov 26 04:03:44 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.timer: Deactivated successfully. Nov 26 04:03:44 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065. Nov 26 04:03:44 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Failed to open /run/systemd/transient/90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: No such file or directory Nov 26 04:03:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:03:44 localhost podman[106879]: 2025-11-26 09:03:44.460817126 +0000 UTC m=+42.174594071 container cleanup 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044) Nov 26 04:03:44 localhost podman[106879]: ceilometer_agent_ipmi Nov 26 04:03:44 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.timer: Failed to open /run/systemd/transient/90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.timer: No such file or directory Nov 26 04:03:44 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Failed to open /run/systemd/transient/90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: No such file or directory Nov 26 04:03:44 localhost podman[107209]: 2025-11-26 09:03:44.471753588 +0000 UTC m=+0.080231623 container cleanup 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 26 04:03:44 localhost systemd[1]: libpod-conmon-90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.scope: Deactivated successfully. 
Nov 26 04:03:44 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.timer: Failed to open /run/systemd/transient/90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.timer: No such file or directory Nov 26 04:03:44 localhost systemd[1]: 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: Failed to open /run/systemd/transient/90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065.service: No such file or directory Nov 26 04:03:44 localhost podman[107223]: 2025-11-26 09:03:44.571843442 +0000 UTC m=+0.071666325 container cleanup 90c110dc92992a9256e1ad3034dc50b59ab7e2341ceb927c2931a667ddba5065 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1) Nov 26 04:03:44 localhost podman[107223]: ceilometer_agent_ipmi Nov 26 04:03:44 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully. Nov 26 04:03:44 localhost systemd[1]: Stopped ceilometer_agent_ipmi container. Nov 26 04:03:45 localhost python3.9[107326]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:03:45 localhost systemd[1]: var-lib-containers-storage-overlay-17c292e04cc2973af4faecaf51a38b12d0c20f47d0b5fc279a11e99087cbc694-merged.mount: Deactivated successfully. Nov 26 04:03:45 localhost systemd[1]: Reloading. Nov 26 04:03:45 localhost systemd-rc-local-generator[107349]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 04:03:45 localhost systemd-sysv-generator[107353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:03:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:03:45 localhost systemd[1]: Stopping collectd container... Nov 26 04:03:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:ef:84:b8 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=36218 SEQ=2555751720 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Nov 26 04:03:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:03:46 localhost systemd[1]: tmp-crun.vrPEhe.mount: Deactivated successfully. 
Nov 26 04:03:46 localhost podman[107380]: 2025-11-26 09:03:46.833075787 +0000 UTC m=+0.090790934 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=) Nov 26 04:03:46 localhost podman[107380]: 2025-11-26 09:03:46.874496814 +0000 UTC m=+0.132211931 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible) Nov 26 04:03:46 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 04:03:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53542 DF PROTO=TCP SPT=56716 DPT=9105 SEQ=3125035489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4B0FC0000000001030307) Nov 26 04:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. 
Nov 26 04:03:49 localhost podman[107399]: 2025-11-26 09:03:49.551707393 +0000 UTC m=+0.059447091 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, 
com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Nov 26 04:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 04:03:49 localhost podman[107399]: 2025-11-26 09:03:49.564534025 +0000 UTC m=+0.072273773 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid) Nov 26 04:03:49 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. Nov 26 04:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 04:03:49 localhost podman[107418]: Error: container 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec is not running Nov 26 04:03:49 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Main process exited, code=exited, status=125/n/a Nov 26 04:03:49 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Failed with result 'exit-code'. Nov 26 04:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:03:49 localhost podman[107429]: 2025-11-26 09:03:49.697712614 +0000 UTC m=+0.095143240 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) 
Nov 26 04:03:49 localhost podman[107429]: 2025-11-26 09:03:49.718460494 +0000 UTC m=+0.115891140 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Nov 26 04:03:49 localhost podman[107429]: unhealthy Nov 26 04:03:49 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:03:49 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 04:03:49 localhost podman[107447]: 2025-11-26 09:03:49.783796679 +0000 UTC m=+0.078005803 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 04:03:49 localhost podman[107447]: 2025-11-26 09:03:49.803263618 +0000 UTC m=+0.097472742 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4) Nov 26 04:03:49 localhost podman[107447]: unhealthy Nov 26 04:03:49 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:03:49 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:03:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 04:03:51 localhost podman[107469]: 2025-11-26 09:03:51.819867896 +0000 UTC m=+0.081665497 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1) Nov 26 04:03:52 localhost podman[107469]: 2025-11-26 09:03:52.210482804 +0000 UTC m=+0.472280435 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 26 04:03:52 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. Nov 26 04:03:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53543 DF PROTO=TCP SPT=56716 DPT=9105 SEQ=3125035489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4C0BD0000000001030307) Nov 26 04:03:54 localhost sshd[107492]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:03:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49263 DF PROTO=TCP SPT=54630 DPT=9102 SEQ=3273547792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4C83D0000000001030307) Nov 26 04:03:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2397 DF PROTO=TCP SPT=37520 DPT=9102 SEQ=1026897194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4D3FC0000000001030307) Nov 26 04:04:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49265 DF PROTO=TCP SPT=54630 DPT=9102 SEQ=3273547792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4DFFC0000000001030307) Nov 26 04:04:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45884 
DF PROTO=TCP SPT=60714 DPT=9101 SEQ=4287321958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4ECBC0000000001030307) Nov 26 04:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 04:04:08 localhost podman[107494]: 2025-11-26 09:04:08.82679511 +0000 UTC m=+0.086361925 container health_status b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 26 04:04:09 localhost podman[107494]: 2025-11-26 09:04:09.026848142 +0000 UTC m=+0.286414967 container exec_died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public) Nov 26 04:04:09 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Deactivated successfully. 
Nov 26 04:04:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49266 DF PROTO=TCP SPT=54630 DPT=9102 SEQ=3273547792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A4FFFC0000000001030307) Nov 26 04:04:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:ef:84:b8 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=36218 SEQ=2555751720 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Nov 26 04:04:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 04:04:12 localhost podman[107523]: 2025-11-26 09:04:12.572772213 +0000 UTC m=+0.083403942 container health_status f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 04:04:12 localhost podman[107523]: 2025-11-26 09:04:12.593206652 +0000 UTC m=+0.103838331 container exec_died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute) Nov 26 04:04:12 localhost podman[107523]: unhealthy Nov 26 04:04:12 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:04:12 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'. 
Nov 26 04:04:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52536 DF PROTO=TCP SPT=44190 DPT=9105 SEQ=1849284017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A51A310000000001030307) Nov 26 04:04:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52537 DF PROTO=TCP SPT=44190 DPT=9105 SEQ=1849284017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A51E3C0000000001030307) Nov 26 04:04:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:04:17 localhost systemd[1]: tmp-crun.YzD3O1.mount: Deactivated successfully. Nov 26 04:04:17 localhost podman[107545]: 2025-11-26 09:04:17.285644294 +0000 UTC m=+0.054736784 container health_status 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 04:04:17 localhost podman[107545]: 2025-11-26 09:04:17.321346832 +0000 UTC m=+0.090439272 container exec_died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron) Nov 26 04:04:17 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Deactivated successfully. Nov 26 04:04:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52538 DF PROTO=TCP SPT=44190 DPT=9105 SEQ=1849284017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5263D0000000001030307) Nov 26 04:04:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. Nov 26 04:04:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. Nov 26 04:04:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:04:19 localhost systemd[1]: tmp-crun.7VWkHi.mount: Deactivated successfully. 
Nov 26 04:04:19 localhost podman[107565]: 2025-11-26 09:04:19.838025692 +0000 UTC m=+0.099205236 container health_status 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, tcib_managed=true, 
architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc.) Nov 26 04:04:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:04:19 localhost podman[107565]: 2025-11-26 09:04:19.850326698 +0000 UTC m=+0.111506242 container exec_died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:04:19 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Deactivated successfully. 
Nov 26 04:04:19 localhost podman[107606]: 2025-11-26 09:04:19.922470186 +0000 UTC m=+0.065680117 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, container_name=ovn_controller, release=1761123044, vcs-type=git) Nov 26 04:04:19 localhost podman[107606]: 2025-11-26 09:04:19.939346355 +0000 UTC m=+0.082556286 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git) Nov 26 04:04:19 localhost podman[107606]: unhealthy Nov 26 04:04:19 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:04:19 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:04:19 localhost systemd[1]: tmp-crun.pZsEqA.mount: Deactivated successfully. Nov 26 04:04:19 localhost podman[107567]: 2025-11-26 09:04:19.987651397 +0000 UTC m=+0.241293725 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, 
tcib_managed=true, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 04:04:20 localhost 
podman[107567]: 2025-11-26 09:04:20.00437503 +0000 UTC m=+0.258017288 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:14:25Z, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 04:04:20 localhost podman[107567]: unhealthy Nov 26 04:04:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:04:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 04:04:20 localhost podman[107566]: Error: container 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec is not running Nov 26 04:04:20 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Main process exited, code=exited, status=125/n/a Nov 26 04:04:20 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Failed with result 'exit-code'. Nov 26 04:04:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 04:04:22 localhost podman[107635]: 2025-11-26 09:04:22.567711213 +0000 UTC m=+0.078488668 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=) Nov 26 04:04:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52539 DF PROTO=TCP SPT=44190 DPT=9105 SEQ=1849284017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A535FC0000000001030307) Nov 26 04:04:22 localhost podman[107635]: 2025-11-26 09:04:22.960295892 +0000 UTC m=+0.471073327 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
container_name=nova_migration_target, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1) Nov 26 04:04:22 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. 
Nov 26 04:04:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10552 DF PROTO=TCP SPT=56918 DPT=9102 SEQ=2507069629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A53D7C0000000001030307) Nov 26 04:04:27 localhost podman[107367]: time="2025-11-26T09:04:27Z" level=warning msg="StopSignal SIGTERM failed to stop container collectd in 42 seconds, resorting to SIGKILL" Nov 26 04:04:27 localhost systemd[1]: tmp-crun.zt8rR1.mount: Deactivated successfully. Nov 26 04:04:27 localhost systemd[1]: libpod-1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.scope: Deactivated successfully. Nov 26 04:04:27 localhost systemd[1]: libpod-1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.scope: Consumed 2.064s CPU time. Nov 26 04:04:27 localhost podman[107367]: 2025-11-26 09:04:27.842290578 +0000 UTC m=+42.091940176 container died 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, release=1761123044, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, name=rhosp17/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc.) Nov 26 04:04:27 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.timer: Deactivated successfully. Nov 26 04:04:27 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec. 
Nov 26 04:04:27 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Failed to open /run/systemd/transient/1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: No such file or directory Nov 26 04:04:27 localhost podman[107367]: 2025-11-26 09:04:27.899393525 +0000 UTC m=+42.149043103 container cleanup 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-type=git, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 26 04:04:27 localhost podman[107367]: collectd Nov 26 04:04:27 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.timer: Failed to open /run/systemd/transient/1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.timer: No such file or directory Nov 26 04:04:27 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Failed to open /run/systemd/transient/1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: No such file or directory Nov 26 04:04:27 localhost podman[107659]: 2025-11-26 09:04:27.975208578 +0000 UTC m=+0.129017509 container cleanup 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, 
io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, release=1761123044, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 26 04:04:27 localhost systemd[1]: libpod-conmon-1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.scope: Deactivated successfully. Nov 26 04:04:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57932 DF PROTO=TCP SPT=51160 DPT=9102 SEQ=2856988641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A549FC0000000001030307) Nov 26 04:04:28 localhost podman[107689]: error opening file `/run/crun/1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec/status`: No such file or directory Nov 26 04:04:28 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.timer: Failed to open /run/systemd/transient/1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.timer: No such file or directory Nov 26 04:04:28 localhost systemd[1]: 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: Failed to open /run/systemd/transient/1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec.service: No such file or directory Nov 26 04:04:28 localhost podman[107677]: 2025-11-26 09:04:28.086034298 +0000 UTC m=+0.079897643 container cleanup 1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:04:28 localhost podman[107677]: collectd Nov 26 04:04:28 localhost systemd[1]: tripleo_collectd.service: Deactivated successfully. Nov 26 04:04:28 localhost systemd[1]: Stopped collectd container. Nov 26 04:04:28 localhost systemd[1]: var-lib-containers-storage-overlay-ab1eeb830657f9ab8bbf0a1c1e595d808a09550d63278050b820041c6a307d5f-merged.mount: Deactivated successfully. Nov 26 04:04:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1985ed95e622c66e639a5ed2183fa07f0570fb6c87c29d0b006a9523387632ec-userdata-shm.mount: Deactivated successfully. Nov 26 04:04:28 localhost python3.9[107783]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:04:28 localhost systemd[1]: Reloading. Nov 26 04:04:29 localhost systemd-rc-local-generator[107806]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:04:29 localhost systemd-sysv-generator[107810]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:04:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:04:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 26 04:04:29 localhost systemd[1]: Stopping iscsid container... Nov 26 04:04:29 localhost recover_tripleo_nova_virtqemud[107824]: 61604 Nov 26 04:04:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 04:04:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 04:04:29 localhost systemd[1]: tmp-crun.OPOJWS.mount: Deactivated successfully. Nov 26 04:04:29 localhost systemd[1]: libpod-1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.scope: Deactivated successfully. Nov 26 04:04:29 localhost systemd[1]: libpod-1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.scope: Consumed 1.063s CPU time. Nov 26 04:04:29 localhost podman[107825]: 2025-11-26 09:04:29.406206675 +0000 UTC m=+0.073778751 container died 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, container_name=iscsid) Nov 26 04:04:29 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.timer: Deactivated successfully. Nov 26 04:04:29 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab. 
Nov 26 04:04:29 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Failed to open /run/systemd/transient/1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: No such file or directory Nov 26 04:04:29 localhost podman[107825]: 2025-11-26 09:04:29.44151623 +0000 UTC m=+0.109088256 container cleanup 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Nov 26 04:04:29 localhost podman[107825]: iscsid Nov 26 04:04:29 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.timer: Failed to open /run/systemd/transient/1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.timer: No such file or directory Nov 26 04:04:29 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Failed to open /run/systemd/transient/1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: No such file or directory Nov 26 04:04:29 localhost podman[107838]: 2025-11-26 09:04:29.477381572 +0000 UTC m=+0.063338813 container cleanup 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com) Nov 26 04:04:29 
localhost systemd[1]: libpod-conmon-1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.scope: Deactivated successfully. Nov 26 04:04:29 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.timer: Failed to open /run/systemd/transient/1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.timer: No such file or directory Nov 26 04:04:29 localhost systemd[1]: 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: Failed to open /run/systemd/transient/1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab.service: No such file or directory Nov 26 04:04:29 localhost podman[107852]: 2025-11-26 09:04:29.580257143 +0000 UTC m=+0.068122843 container cleanup 1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=) Nov 26 04:04:29 localhost podman[107852]: iscsid Nov 26 04:04:29 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully. Nov 26 04:04:29 localhost systemd[1]: Stopped iscsid container. Nov 26 04:04:29 localhost systemd[1]: var-lib-containers-storage-overlay-14988db00eaf3274b740fc90a2db62d16af3a82b44457432a1a6aa29dc90bda4-merged.mount: Deactivated successfully. Nov 26 04:04:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1169eaed688d1ca43b7aceaed41d7a3dbc16973650c46d0bfb18936bd683e5ab-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:04:30 localhost python3.9[107956]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:04:30 localhost systemd[1]: Reloading. Nov 26 04:04:30 localhost systemd-rc-local-generator[107985]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:04:30 localhost systemd-sysv-generator[107989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:04:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:04:30 localhost systemd[1]: Stopping logrotate_crond container... Nov 26 04:04:30 localhost systemd[1]: libpod-7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.scope: Deactivated successfully. Nov 26 04:04:30 localhost systemd[1]: libpod-7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.scope: Consumed 1.052s CPU time. 
Nov 26 04:04:30 localhost podman[107996]: 2025-11-26 09:04:30.709528594 +0000 UTC m=+0.086649424 container died 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1) Nov 26 04:04:30 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.timer: Deactivated successfully. Nov 26 04:04:30 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c. Nov 26 04:04:30 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Failed to open /run/systemd/transient/7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: No such file or directory Nov 26 04:04:30 localhost podman[107996]: 2025-11-26 09:04:30.758976812 +0000 UTC m=+0.136097622 container cleanup 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 26 04:04:30 localhost podman[107996]: logrotate_crond Nov 26 04:04:30 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.timer: Failed to open /run/systemd/transient/7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.timer: No such file or directory Nov 26 04:04:30 localhost systemd[1]: 
7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Failed to open /run/systemd/transient/7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: No such file or directory Nov 26 04:04:30 localhost podman[108009]: 2025-11-26 09:04:30.805234799 +0000 UTC m=+0.084634780 container cleanup 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 26 04:04:30 localhost systemd[1]: tmp-crun.mnjmTb.mount: Deactivated successfully. Nov 26 04:04:30 localhost systemd[1]: var-lib-containers-storage-overlay-f99cd177b672ff33074ec35abbc6210e048ba1785e645693f779453f3bd61c4d-merged.mount: Deactivated successfully. Nov 26 04:04:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c-userdata-shm.mount: Deactivated successfully. Nov 26 04:04:30 localhost systemd[1]: libpod-conmon-7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.scope: Deactivated successfully. 
Nov 26 04:04:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10554 DF PROTO=TCP SPT=56918 DPT=9102 SEQ=2507069629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5553C0000000001030307) Nov 26 04:04:30 localhost podman[108040]: error opening file `/run/crun/7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c/status`: No such file or directory Nov 26 04:04:30 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.timer: Failed to open /run/systemd/transient/7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.timer: No such file or directory Nov 26 04:04:30 localhost systemd[1]: 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: Failed to open /run/systemd/transient/7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c.service: No such file or directory Nov 26 04:04:30 localhost podman[108026]: 2025-11-26 09:04:30.921406547 +0000 UTC m=+0.081276986 container cleanup 7f65ad87705ff132331075904c315c75e97a94b65fd81d3cdf9cf9d4dc04ad7c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-cron, 
com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4) Nov 26 04:04:30 localhost podman[108026]: logrotate_crond Nov 26 04:04:30 localhost systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully. Nov 26 04:04:30 localhost systemd[1]: Stopped logrotate_crond container. 
Nov 26 04:04:31 localhost python3.9[108133]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:04:31 localhost systemd[1]: Reloading. Nov 26 04:04:31 localhost systemd-rc-local-generator[108164]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:04:31 localhost systemd-sysv-generator[108168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:04:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:04:32 localhost systemd[1]: Stopping metrics_qdr container... Nov 26 04:04:32 localhost kernel: qdrouterd[54298]: segfault at 0 ip 00007fada87777cb sp 00007fffac09ce70 error 4 in libc.so.6[7fada8714000+175000] Nov 26 04:04:32 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9 Nov 26 04:04:32 localhost systemd[1]: Created slice Slice /system/systemd-coredump. Nov 26 04:04:32 localhost systemd[1]: Started Process Core Dump (PID 108187/UID 0). Nov 26 04:04:32 localhost systemd-coredump[108188]: Resource limits disable core dumping for process 54298 (qdrouterd). Nov 26 04:04:32 localhost systemd-coredump[108188]: Process 54298 (qdrouterd) of user 42465 dumped core. Nov 26 04:04:32 localhost systemd[1]: systemd-coredump@0-108187-0.service: Deactivated successfully. 
Nov 26 04:04:32 localhost podman[108174]: 2025-11-26 09:04:32.353378973 +0000 UTC m=+0.277111506 container died b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, version=17.1.12, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.openshift.expose-services=) Nov 26 04:04:32 localhost systemd[1]: libpod-b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.scope: Deactivated successfully. Nov 26 04:04:32 localhost systemd[1]: libpod-b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.scope: Consumed 28.038s CPU time. Nov 26 04:04:32 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.timer: Deactivated successfully. Nov 26 04:04:32 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176. Nov 26 04:04:32 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Failed to open /run/systemd/transient/b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: No such file or directory Nov 26 04:04:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176-userdata-shm.mount: Deactivated successfully. Nov 26 04:04:32 localhost systemd[1]: var-lib-containers-storage-overlay-772983d29741817fb5112b04db0ec34846c51e947d40ce51144a956997c63192-merged.mount: Deactivated successfully. 
Nov 26 04:04:32 localhost podman[108174]: 2025-11-26 09:04:32.453804217 +0000 UTC m=+0.377536750 container cleanup b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 26 04:04:32 localhost podman[108174]: metrics_qdr Nov 26 04:04:32 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.timer: Failed to open /run/systemd/transient/b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.timer: No such file or directory Nov 26 04:04:32 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Failed to open /run/systemd/transient/b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: No such file or directory Nov 26 04:04:32 localhost podman[108192]: 2025-11-26 09:04:32.469926442 +0000 UTC m=+0.106691522 container cleanup b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:49:46Z) Nov 26 04:04:32 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a Nov 26 04:04:32 localhost systemd[1]: libpod-conmon-b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.scope: 
Deactivated successfully. Nov 26 04:04:32 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.timer: Failed to open /run/systemd/transient/b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.timer: No such file or directory Nov 26 04:04:32 localhost systemd[1]: b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: Failed to open /run/systemd/transient/b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176.service: No such file or directory Nov 26 04:04:32 localhost podman[108207]: 2025-11-26 09:04:32.578590973 +0000 UTC m=+0.072133829 container cleanup b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cb52c88276a571bf332b7657a13eab07'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, tcib_managed=true, architecture=x86_64, release=1761123044, vcs-type=git, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 26 04:04:32 localhost podman[108207]: metrics_qdr Nov 26 04:04:32 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'. Nov 26 04:04:32 localhost systemd[1]: Stopped metrics_qdr container. 
Nov 26 04:04:33 localhost sshd[108344]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:04:33 localhost python3.9[108342]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:04:34 localhost python3.9[108469]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:04:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12002 DF PROTO=TCP SPT=41652 DPT=9101 SEQ=1117584996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A561FD0000000001030307)
Nov 26 04:04:34 localhost python3.9[108576]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:04:35 localhost python3.9[108670]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:04:35 localhost systemd[1]: Reloading.
Nov 26 04:04:35 localhost systemd-rc-local-generator[108693]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:04:35 localhost systemd-sysv-generator[108696]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:04:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:04:35 localhost systemd[1]: Stopping nova_compute container...
Nov 26 04:04:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:ef:84:b8 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=36218 SEQ=2555751720 ACK=0 WINDOW=0 RES=0x00 RST URGP=0
Nov 26 04:04:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12003 DF PROTO=TCP SPT=41652 DPT=9101 SEQ=1117584996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A581FC0000000001030307)
Nov 26 04:04:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.
Nov 26 04:04:42 localhost podman[108723]: Error: container f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d is not running
Nov 26 04:04:42 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=125/n/a
Nov 26 04:04:42 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'.
Nov 26 04:04:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53589 DF PROTO=TCP SPT=56660 DPT=9882 SEQ=474654929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A583FC0000000001030307)
Nov 26 04:04:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28824 DF PROTO=TCP SPT=45578 DPT=9105 SEQ=1484000179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A58F600000000001030307)
Nov 26 04:04:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28826 DF PROTO=TCP SPT=45578 DPT=9105 SEQ=1484000179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A59B7D0000000001030307)
Nov 26 04:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.
Nov 26 04:04:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.
Nov 26 04:04:50 localhost podman[108735]: 2025-11-26 09:04:50.082585435 +0000 UTC m=+0.094497319 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:04:50 localhost podman[108735]: 2025-11-26 09:04:50.100687532 +0000 UTC m=+0.112599456 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4) Nov 26 04:04:50 localhost podman[108735]: unhealthy Nov 26 04:04:50 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:04:50 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:04:50 localhost podman[108752]: 2025-11-26 09:04:50.170724874 +0000 UTC m=+0.083564407 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, 
Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 04:04:50 localhost podman[108752]: 2025-11-26 09:04:50.191363091 +0000 UTC m=+0.104202474 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 04:04:50 localhost podman[108752]: unhealthy Nov 26 04:04:50 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:04:50 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 04:04:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28827 DF PROTO=TCP SPT=45578 DPT=9105 SEQ=1484000179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5AB3D0000000001030307) Nov 26 04:04:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. 
Nov 26 04:04:53 localhost podman[108775]: 2025-11-26 09:04:53.314059793 +0000 UTC m=+0.077006011 container health_status 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, 
io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 04:04:53 localhost podman[108775]: 2025-11-26 09:04:53.654515841 +0000 UTC m=+0.417462099 container exec_died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=) Nov 26 04:04:53 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Deactivated successfully. 
Nov 26 04:04:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51006 DF PROTO=TCP SPT=42712 DPT=9102 SEQ=3777487237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5B2BC0000000001030307)
Nov 26 04:04:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5611 DF PROTO=TCP SPT=53270 DPT=9101 SEQ=1808949504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5BF7C0000000001030307)
Nov 26 04:05:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51008 DF PROTO=TCP SPT=42712 DPT=9102 SEQ=3777487237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5CA7C0000000001030307)
Nov 26 04:05:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5613 DF PROTO=TCP SPT=53270 DPT=9101 SEQ=1808949504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5D73D0000000001030307)
Nov 26 04:05:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51009 DF PROTO=TCP SPT=42712 DPT=9102 SEQ=3777487237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5E9FC0000000001030307)
Nov 26 04:05:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5614 DF PROTO=TCP SPT=53270 DPT=9101 SEQ=1808949504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5F7FC0000000001030307)
Nov 26 04:05:12 localhost sshd[108798]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:05:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.
Nov 26 04:05:13 localhost podman[108799]: Error: container f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d is not running
Nov 26 04:05:13 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Main process exited, code=exited, status=125/n/a
Nov 26 04:05:13 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed with result 'exit-code'.
Nov 26 04:05:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34989 DF PROTO=TCP SPT=53050 DPT=9882 SEQ=994700693 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A5F9FD0000000001030307)
Nov 26 04:05:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60619 DF PROTO=TCP SPT=49480 DPT=9105 SEQ=4106380345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6087D0000000001030307)
Nov 26 04:05:18 localhost podman[108710]: time="2025-11-26T09:05:18Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Nov 26 04:05:18 localhost systemd[1]: session-c11.scope: Deactivated successfully.
Nov 26 04:05:18 localhost systemd[1]: libpod-f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.scope: Deactivated successfully.
Nov 26 04:05:18 localhost systemd[1]: libpod-f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.scope: Consumed 36.968s CPU time.
Nov 26 04:05:18 localhost podman[108710]: 2025-11-26 09:05:18.031333469 +0000 UTC m=+42.103698489 container died f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:05:18 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.timer: Deactivated successfully. Nov 26 04:05:18 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d. Nov 26 04:05:18 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed to open /run/systemd/transient/f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: No such file or directory Nov 26 04:05:18 localhost systemd[1]: var-lib-containers-storage-overlay-c309ab81ad8c7882d0bc2a3cffa363ac8b346f70f1c23bf5a7e70394ef52b071-merged.mount: Deactivated successfully. 
Nov 26 04:05:18 localhost podman[108710]: 2025-11-26 09:05:18.100244027 +0000 UTC m=+42.172609027 container cleanup f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, config_id=tripleo_step5, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-type=git, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_compute) Nov 26 04:05:18 localhost podman[108710]: nova_compute Nov 26 04:05:18 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.timer: Failed to open /run/systemd/transient/f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.timer: No such file or directory Nov 26 04:05:18 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed to open /run/systemd/transient/f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: No such file or directory Nov 26 04:05:18 localhost podman[108814]: 2025-11-26 09:05:18.172422956 +0000 UTC m=+0.128678109 container cleanup 
f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 26 04:05:18 localhost systemd[1]: libpod-conmon-f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.scope: Deactivated successfully. 
Nov 26 04:05:18 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.timer: Failed to open /run/systemd/transient/f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.timer: No such file or directory Nov 26 04:05:18 localhost systemd[1]: f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: Failed to open /run/systemd/transient/f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d.service: No such file or directory Nov 26 04:05:18 localhost podman[108831]: 2025-11-26 09:05:18.275004977 +0000 UTC m=+0.070105545 container cleanup f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 26 04:05:18 localhost podman[108831]: nova_compute Nov 26 04:05:18 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. Nov 26 04:05:18 localhost systemd[1]: Stopped nova_compute container. 
Nov 26 04:05:18 localhost systemd[1]: tripleo_nova_compute.service: Consumed 1.099s CPU time, no IO. Nov 26 04:05:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60620 DF PROTO=TCP SPT=49480 DPT=9105 SEQ=4106380345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6107C0000000001030307) Nov 26 04:05:19 localhost python3.9[108934]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:05:19 localhost systemd[1]: Reloading. Nov 26 04:05:19 localhost systemd-sysv-generator[108964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:05:19 localhost systemd-rc-local-generator[108961]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:05:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:05:19 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 04:05:19 localhost systemd[1]: Stopping nova_migration_target container... Nov 26 04:05:19 localhost recover_tripleo_nova_virtqemud[108977]: 61604 Nov 26 04:05:19 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 04:05:19 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 04:05:19 localhost systemd[1]: libpod-9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.scope: Deactivated successfully. 
Nov 26 04:05:19 localhost systemd[1]: libpod-9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.scope: Consumed 33.970s CPU time. Nov 26 04:05:19 localhost podman[108976]: 2025-11-26 09:05:19.420035231 +0000 UTC m=+0.059524434 container died 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:05:19 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.timer: Deactivated successfully. Nov 26 04:05:19 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7. Nov 26 04:05:19 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Failed to open /run/systemd/transient/9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: No such file or directory Nov 26 04:05:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:05:19 localhost podman[108976]: 2025-11-26 09:05:19.481084823 +0000 UTC m=+0.120573966 container cleanup 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git) Nov 26 04:05:19 localhost podman[108976]: nova_migration_target Nov 26 04:05:19 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.timer: Failed to open /run/systemd/transient/9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.timer: No such file or directory Nov 26 04:05:19 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Failed to open /run/systemd/transient/9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: No such file or directory Nov 26 04:05:19 localhost podman[108990]: 2025-11-26 09:05:19.5014514 +0000 UTC m=+0.076966040 container cleanup 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, 
distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, config_id=tripleo_step4) Nov 26 04:05:19 localhost systemd[1]: libpod-conmon-9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.scope: Deactivated successfully. 
Nov 26 04:05:19 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.timer: Failed to open /run/systemd/transient/9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.timer: No such file or directory Nov 26 04:05:19 localhost systemd[1]: 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: Failed to open /run/systemd/transient/9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7.service: No such file or directory Nov 26 04:05:19 localhost podman[109004]: 2025-11-26 09:05:19.594815373 +0000 UTC m=+0.064042396 container cleanup 9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com) Nov 26 04:05:19 localhost podman[109004]: nova_migration_target Nov 26 04:05:19 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Nov 26 04:05:19 localhost systemd[1]: Stopped nova_migration_target container. Nov 26 04:05:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:05:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 04:05:20 localhost python3.9[109109]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:05:20 localhost podman[109111]: 2025-11-26 09:05:20.338883645 +0000 UTC m=+0.098352100 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 26 04:05:20 localhost systemd[1]: Reloading. 
Nov 26 04:05:20 localhost podman[109110]: 2025-11-26 09:05:20.371818206 +0000 UTC m=+0.132099636 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true) Nov 26 04:05:20 localhost podman[109110]: 2025-11-26 09:05:20.384712099 +0000 UTC m=+0.144993579 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044) Nov 26 04:05:20 localhost podman[109110]: unhealthy Nov 26 04:05:20 localhost systemd-rc-local-generator[109173]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:05:20 localhost systemd-sysv-generator[109179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:05:20 localhost podman[109111]: 2025-11-26 09:05:20.475531533 +0000 UTC m=+0.234999978 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Nov 26 04:05:20 localhost podman[109111]: unhealthy Nov 26 04:05:20 localhost systemd[1]: 
/usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:05:20 localhost systemd[1]: var-lib-containers-storage-overlay-1ad32a5db29098f5568060ccdb89afe68c9fb2dd318793af5aa95785da54e96e-merged.mount: Deactivated successfully. Nov 26 04:05:20 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:05:20 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:05:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:05:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 04:05:20 localhost systemd[1]: Stopping nova_virtlogd_wrapper container... Nov 26 04:05:20 localhost systemd[1]: libpod-8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93.scope: Deactivated successfully. 
Nov 26 04:05:20 localhost podman[109190]: 2025-11-26 09:05:20.797567344 +0000 UTC m=+0.076097883 container died 8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtlogd_wrapper, managed_by=tripleo_ansible) Nov 26 04:05:20 localhost systemd[1]: tmp-crun.Q96BTr.mount: Deactivated successfully. 
Nov 26 04:05:20 localhost podman[109190]: 2025-11-26 09:05:20.843200112 +0000 UTC m=+0.121730651 container cleanup 8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 26 04:05:20 localhost podman[109190]: nova_virtlogd_wrapper Nov 26 04:05:20 localhost podman[109204]: 2025-11-26 09:05:20.860163673 +0000 UTC m=+0.058658817 container cleanup 8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtlogd_wrapper, version=17.1.12, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-nova-libvirt, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 26 04:05:21 localhost systemd[1]: var-lib-containers-storage-overlay-3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d-merged.mount: Deactivated successfully. Nov 26 04:05:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:05:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60621 DF PROTO=TCP SPT=49480 DPT=9105 SEQ=4106380345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6203C0000000001030307) Nov 26 04:05:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17571 DF PROTO=TCP SPT=53230 DPT=9102 SEQ=2870595829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A627FC0000000001030307) Nov 26 04:05:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58640 DF PROTO=TCP SPT=51122 DPT=9100 SEQ=2469258810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A633FC0000000001030307) Nov 26 04:05:28 localhost systemd[1]: Stopping User Manager for UID 0... Nov 26 04:05:28 localhost systemd[83704]: Activating special unit Exit the Session... Nov 26 04:05:28 localhost systemd[83704]: Removed slice User Background Tasks Slice. Nov 26 04:05:28 localhost systemd[83704]: Stopped target Main User Target. Nov 26 04:05:28 localhost systemd[83704]: Stopped target Basic System. Nov 26 04:05:28 localhost systemd[83704]: Stopped target Paths. Nov 26 04:05:28 localhost systemd[83704]: Stopped target Sockets. Nov 26 04:05:28 localhost systemd[83704]: Stopped target Timers. Nov 26 04:05:28 localhost systemd[83704]: Stopped Daily Cleanup of User's Temporary Directories. Nov 26 04:05:28 localhost systemd[83704]: Closed D-Bus User Message Bus Socket. Nov 26 04:05:28 localhost systemd[83704]: Stopped Create User's Volatile Files and Directories. Nov 26 04:05:28 localhost systemd[83704]: Removed slice User Application Slice. 
Nov 26 04:05:28 localhost systemd[83704]: Reached target Shutdown. Nov 26 04:05:28 localhost systemd[83704]: Finished Exit the Session. Nov 26 04:05:28 localhost systemd[83704]: Reached target Exit the Session. Nov 26 04:05:28 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 26 04:05:28 localhost systemd[1]: Stopped User Manager for UID 0. Nov 26 04:05:28 localhost systemd[1]: user@0.service: Consumed 3.985s CPU time, no IO. Nov 26 04:05:28 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 26 04:05:28 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 26 04:05:28 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 26 04:05:28 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 26 04:05:28 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 26 04:05:28 localhost systemd[1]: user-0.slice: Consumed 4.995s CPU time. Nov 26 04:05:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17573 DF PROTO=TCP SPT=53230 DPT=9102 SEQ=2870595829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A63FBC0000000001030307) Nov 26 04:05:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44144 DF PROTO=TCP SPT=39324 DPT=9101 SEQ=2504590409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A64C3C0000000001030307) Nov 26 04:05:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17574 DF PROTO=TCP SPT=53230 DPT=9102 SEQ=2870595829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A65FFD0000000001030307) Nov 26 04:05:42 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44145 DF PROTO=TCP SPT=39324 DPT=9101 SEQ=2504590409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A66BFC0000000001030307) Nov 26 04:05:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63620 DF PROTO=TCP SPT=36356 DPT=9105 SEQ=2386594476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A679C00000000001030307) Nov 26 04:05:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63621 DF PROTO=TCP SPT=36356 DPT=9105 SEQ=2386594476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A67DBC0000000001030307) Nov 26 04:05:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63622 DF PROTO=TCP SPT=36356 DPT=9105 SEQ=2386594476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A685BC0000000001030307) Nov 26 04:05:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:05:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 04:05:50 localhost podman[109298]: 2025-11-26 09:05:50.816769411 +0000 UTC m=+0.072560382 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 26 04:05:50 localhost podman[109298]: 2025-11-26 09:05:50.838208632 +0000 UTC m=+0.093999673 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, container_name=ovn_controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 04:05:50 localhost podman[109298]: unhealthy Nov 26 04:05:50 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:05:50 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. Nov 26 04:05:50 localhost systemd[1]: tmp-crun.HAuSmU.mount: Deactivated successfully. Nov 26 04:05:50 localhost podman[109299]: 2025-11-26 09:05:50.892734719 +0000 UTC m=+0.144637399 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:14:25Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, url=https://www.redhat.com) Nov 26 04:05:50 localhost podman[109299]: 2025-11-26 09:05:50.9046055 +0000 UTC m=+0.156508150 
container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 26 04:05:50 localhost podman[109299]: unhealthy Nov 26 04:05:50 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:05:50 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 04:05:52 localhost sshd[109339]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:05:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63623 DF PROTO=TCP SPT=36356 DPT=9105 SEQ=2386594476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6957D0000000001030307) Nov 26 04:05:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42316 DF PROTO=TCP SPT=59464 DPT=9102 SEQ=3347199301 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A69CFD0000000001030307) Nov 26 04:05:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36646 DF PROTO=TCP SPT=41674 DPT=9101 SEQ=2558786360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6A9BC0000000001030307) Nov 26 04:06:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42318 DF PROTO=TCP SPT=59464 DPT=9102 SEQ=3347199301 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6B4BC0000000001030307) Nov 26 04:06:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36648 DF PROTO=TCP SPT=41674 DPT=9101 SEQ=2558786360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6C17D0000000001030307) Nov 26 04:06:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:06:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 
total, 600.0 interval#012Cumulative writes: 5692 writes, 25K keys, 5692 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5692 writes, 763 syncs, 7.46 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 04:06:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42319 DF PROTO=TCP SPT=59464 DPT=9102 SEQ=3347199301 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6D5FC0000000001030307) Nov 26 04:06:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:06:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 4860 writes, 21K keys, 4860 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4860 writes, 621 syncs, 7.83 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 04:06:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36649 DF PROTO=TCP SPT=41674 DPT=9101 SEQ=2558786360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6E1FC0000000001030307) Nov 26 04:06:12 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8214 DF PROTO=TCP SPT=47132 DPT=9882 SEQ=3791962858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6E3FC0000000001030307) Nov 26 04:06:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40575 DF PROTO=TCP SPT=56780 DPT=9105 SEQ=2306419023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6F2FD0000000001030307) Nov 26 04:06:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40576 DF PROTO=TCP SPT=56780 DPT=9105 SEQ=2306419023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A6FAFC0000000001030307) Nov 26 04:06:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:06:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:06:21 localhost systemd[1]: tmp-crun.UVF8Cl.mount: Deactivated successfully. 
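(Editor's aside.) The kernel `DROPPING:` entries that recur throughout this excerpt follow the standard netfilter LOG key=value format with a site-specific `DROPPING:` log prefix; the probes here all target ports 9101/9102/9105/9882 on 192.168.122.107. A minimal sketch of pulling the interesting fields out of such a line — the regex and the `parse_drop` helper are this sketch's own, not part of any shipped tooling:

```python
import re

# Fields follow the standard netfilter LOG key=value layout.
# Lazy ".*?" skips MAC* fields; "MACSRC=" cannot false-match because
# the pattern requires a literal " DST=" right after the SRC value.
DROP_RE = re.compile(
    r"DROPPING: IN=(?P<in_if>\S+) .*?"
    r"SRC=(?P<src>\S+) DST=(?P<dst>\S+) .*?"
    r"PROTO=(?P<proto>\S+) SPT=(?P<spt>\d+) DPT=(?P<dpt>\d+)"
)

def parse_drop(line):
    """Return a dict of the key fields of a DROPPING log line, or None."""
    m = DROP_RE.search(line)
    return m.groupdict() if m else None

# First DROPPING entry from the log above (truncated after DPT).
sample = (
    "Nov 26 04:05:52 localhost kernel: DROPPING: IN=br-ex OUT= "
    "MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 "
    "SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 "
    "TTL=62 ID=63623 DF PROTO=TCP SPT=36356 DPT=9105"
)
fields = parse_drop(sample)
print(fields["dpt"])  # prints "9105"
```

Feeding the whole journal through this and counting by `dpt` would show the repeated SYN retries to the same few ports visible above.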
Nov 26 04:06:21 localhost podman[109342]: 2025-11-26 09:06:21.315877161 +0000 UTC m=+0.072320035 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z) Nov 26 04:06:21 localhost podman[109342]: 2025-11-26 09:06:21.331045166 +0000 UTC m=+0.087488030 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, distribution-scope=public, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, architecture=x86_64, release=1761123044, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 26 04:06:21 localhost podman[109342]: unhealthy Nov 26 04:06:21 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:06:21 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 04:06:21 localhost systemd[1]: tmp-crun.rxXYLm.mount: Deactivated successfully. Nov 26 04:06:21 localhost podman[109341]: 2025-11-26 09:06:21.379903966 +0000 UTC m=+0.139008923 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, container_name=ovn_controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container) Nov 26 04:06:21 localhost podman[109341]: 2025-11-26 09:06:21.396329689 +0000 UTC m=+0.155434646 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git) Nov 26 04:06:21 localhost podman[109341]: unhealthy Nov 26 04:06:21 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:06:21 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
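(Editor's aside.) Each podman `health_status` event above embeds the check result in its attribute list (`health_status=unhealthy`), after which the transient healthcheck unit exits `status=1/FAILURE`. A minimal sketch — the `tally_unhealthy` helper is hypothetical, not podman tooling — that counts unhealthy checks per container name from a journal excerpt like this one:

```python
import re
from collections import Counter

# Match podman "container health_status" events; the first "name=" in
# the attribute list is the container name, followed later by the
# health_status attribute.
HEALTH_RE = re.compile(
    r"container health_status .*?name=(?P<name>[\w-]+).*?"
    r"health_status=(?P<status>\w+)"
)

def tally_unhealthy(journal_text):
    """Count health_status=unhealthy events per container name."""
    counts = Counter()
    for m in HEALTH_RE.finditer(journal_text):
        if m.group("status") == "unhealthy":
            counts[m.group("name")] += 1
    return counts

# Condensed one-line stand-in for the ovn_metadata_agent event above.
sample = (
    "Nov 26 04:06:21 localhost podman[109342]: container health_status "
    "670a0f2e (image=reg.example/agent:17.1, name=ovn_metadata_agent, "
    "health_status=unhealthy, maintainer=Team)"
)
print(tally_unhealthy(sample))  # Counter({'ovn_metadata_agent': 1})
```

Run over the full excerpt, both `ovn_metadata_agent` and `ovn_controller` would appear, matching the two `Failed with result 'exit-code'` units that follow.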
Nov 26 04:06:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40577 DF PROTO=TCP SPT=56780 DPT=9105 SEQ=2306419023 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A70ABC0000000001030307) Nov 26 04:06:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47293 DF PROTO=TCP SPT=49346 DPT=9102 SEQ=3851083387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7123D0000000001030307) Nov 26 04:06:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17576 DF PROTO=TCP SPT=53230 DPT=9102 SEQ=2870595829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A71DFD0000000001030307) Nov 26 04:06:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44147 DF PROTO=TCP SPT=39324 DPT=9101 SEQ=2504590409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A729FC0000000001030307) Nov 26 04:06:30 localhost sshd[109382]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:06:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27087 DF PROTO=TCP SPT=41212 DPT=9101 SEQ=3256315636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A736BC0000000001030307) Nov 26 04:06:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47296 DF PROTO=TCP SPT=49346 DPT=9102 
SEQ=3851083387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A749FC0000000001030307) Nov 26 04:06:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27088 DF PROTO=TCP SPT=41212 DPT=9101 SEQ=3256315636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A757FC0000000001030307) Nov 26 04:06:44 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing. Nov 26 04:06:44 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 60839 (conmon) with signal SIGKILL. Nov 26 04:06:44 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL Nov 26 04:06:44 localhost systemd[1]: libpod-conmon-8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93.scope: Deactivated successfully. Nov 26 04:06:45 localhost podman[109473]: error opening file `/run/crun/8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93/status`: No such file or directory Nov 26 04:06:45 localhost podman[109460]: 2025-11-26 09:06:45.043880563 +0000 UTC m=+0.060046101 container cleanup 8745f127beba509bb46acdd315816193362d33cc29035590ca4ada21f6718d93 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vendor=Red Hat, 
Inc., name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:06:45 localhost podman[109460]: nova_virtlogd_wrapper Nov 26 04:06:45 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'. Nov 26 04:06:45 localhost systemd[1]: Stopped nova_virtlogd_wrapper container. Nov 26 04:06:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47553 DF PROTO=TCP SPT=37504 DPT=9105 SEQ=1873896068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A764200000000001030307) Nov 26 04:06:45 localhost python3.9[109566]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:06:45 localhost systemd[1]: Reloading. Nov 26 04:06:46 localhost systemd-rc-local-generator[109592]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:06:46 localhost systemd-sysv-generator[109598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:06:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:06:46 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 26 04:06:46 localhost systemd[1]: Stopping nova_virtnodedevd container... Nov 26 04:06:46 localhost recover_tripleo_nova_virtqemud[109608]: 61604 Nov 26 04:06:46 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 26 04:06:46 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 26 04:06:46 localhost systemd[1]: libpod-5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d.scope: Deactivated successfully. Nov 26 04:06:46 localhost systemd[1]: libpod-5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d.scope: Consumed 1.495s CPU time. Nov 26 04:06:46 localhost podman[109609]: 2025-11-26 09:06:46.343591789 +0000 UTC m=+0.081510592 container died 5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtnodedevd, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 26 04:06:46 localhost podman[109609]: 2025-11-26 09:06:46.37938676 +0000 UTC m=+0.117305513 container cleanup 5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtnodedevd, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container) Nov 26 04:06:46 localhost podman[109609]: nova_virtnodedevd Nov 26 04:06:46 localhost podman[109623]: 2025-11-26 09:06:46.431262973 +0000 UTC m=+0.074654068 container cleanup 5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 
'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 26 04:06:46 localhost systemd[1]: libpod-conmon-5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d.scope: Deactivated successfully. Nov 26 04:06:46 localhost podman[109651]: error opening file `/run/crun/5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d/status`: No such file or directory Nov 26 04:06:46 localhost podman[109640]: 2025-11-26 09:06:46.54041301 +0000 UTC m=+0.067299677 container cleanup 5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 
'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 26 04:06:46 localhost podman[109640]: nova_virtnodedevd Nov 26 04:06:46 localhost systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully. Nov 26 04:06:46 localhost systemd[1]: Stopped nova_virtnodedevd container. Nov 26 04:06:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47554 DF PROTO=TCP SPT=37504 DPT=9105 SEQ=1873896068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7683C0000000001030307) Nov 26 04:06:47 localhost python3.9[109744]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:06:47 localhost systemd[1]: var-lib-containers-storage-overlay-0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2-merged.mount: Deactivated successfully. Nov 26 04:06:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5994135cac9d5a3d22a1087204d78a6aa9d790c1779606870a5be92c9633bf8d-userdata-shm.mount: Deactivated successfully. Nov 26 04:06:47 localhost systemd[1]: Reloading. Nov 26 04:06:47 localhost systemd-sysv-generator[109770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:06:47 localhost systemd-rc-local-generator[109766]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:06:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:06:47 localhost systemd[1]: Stopping nova_virtproxyd container... Nov 26 04:06:47 localhost systemd[1]: libpod-f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9.scope: Deactivated successfully. Nov 26 04:06:47 localhost podman[109785]: 2025-11-26 09:06:47.724438885 +0000 UTC m=+0.065190122 container died f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_virtproxyd, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible) Nov 26 04:06:47 localhost podman[109785]: 
2025-11-26 09:06:47.751425209 +0000 UTC m=+0.092176446 container cleanup f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, container_name=nova_virtproxyd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 26 04:06:47 localhost podman[109785]: nova_virtproxyd Nov 26 04:06:47 localhost podman[109800]: 2025-11-26 09:06:47.782049838 +0000 UTC m=+0.055863670 container cleanup f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
distribution-scope=public, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtproxyd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}) Nov 26 04:06:47 localhost systemd[1]: libpod-conmon-f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9.scope: Deactivated successfully. Nov 26 04:06:47 localhost podman[109827]: error opening file `/run/crun/f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9/status`: No such file or directory Nov 26 04:06:47 localhost podman[109816]: 2025-11-26 09:06:47.872030445 +0000 UTC m=+0.060178455 container cleanup f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_virtproxyd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_step3, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64) Nov 26 04:06:47 localhost podman[109816]: nova_virtproxyd Nov 26 04:06:47 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Nov 26 04:06:47 localhost systemd[1]: Stopped nova_virtproxyd container. Nov 26 04:06:48 localhost systemd[1]: var-lib-containers-storage-overlay-562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3-merged.mount: Deactivated successfully. Nov 26 04:06:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9-userdata-shm.mount: Deactivated successfully. Nov 26 04:06:48 localhost python3.9[109920]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:06:48 localhost systemd[1]: Reloading. Nov 26 04:06:48 localhost systemd-rc-local-generator[109947]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:06:48 localhost systemd-sysv-generator[109951]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:06:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:06:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47555 DF PROTO=TCP SPT=37504 DPT=9105 SEQ=1873896068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7703C0000000001030307) Nov 26 04:06:48 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. Nov 26 04:06:48 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. Nov 26 04:06:48 localhost systemd[1]: Stopping nova_virtqemud container... Nov 26 04:06:49 localhost systemd[1]: libpod-b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365.scope: Deactivated successfully. Nov 26 04:06:49 localhost systemd[1]: libpod-b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365.scope: Consumed 2.709s CPU time. Nov 26 04:06:49 localhost podman[109961]: 2025-11-26 09:06:49.085567614 +0000 UTC m=+0.077174487 container died b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_virtqemud, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, release=1761123044) Nov 26 04:06:49 localhost podman[109961]: 2025-11-26 09:06:49.113325712 +0000 UTC m=+0.104932585 container cleanup b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, container_name=nova_virtqemud, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}) Nov 26 04:06:49 localhost podman[109961]: nova_virtqemud Nov 26 04:06:49 localhost podman[109974]: 2025-11-26 09:06:49.143492177 +0000 UTC m=+0.046310701 container cleanup b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, 
name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, distribution-scope=public, release=1761123044, container_name=nova_virtqemud, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, version=17.1.12, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git) Nov 26 04:06:49 localhost systemd[1]: var-lib-containers-storage-overlay-7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16-merged.mount: Deactivated successfully. Nov 26 04:06:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365-userdata-shm.mount: Deactivated successfully. Nov 26 04:06:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:06:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 04:06:51 localhost podman[109990]: 2025-11-26 09:06:51.81802754 +0000 UTC m=+0.081918845 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.openshift.expose-services=, release=1761123044, distribution-scope=public) Nov 26 04:06:51 localhost systemd[1]: tmp-crun.0UdeSS.mount: Deactivated successfully. 
Nov 26 04:06:51 localhost podman[109990]: 2025-11-26 09:06:51.836309292 +0000 UTC m=+0.100200527 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Nov 26 04:06:51 localhost podman[109990]: unhealthy Nov 26 04:06:51 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:06:51 localhost podman[109989]: 2025-11-26 09:06:51.796038182 +0000 UTC m=+0.064442748 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:06:51 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 04:06:51 localhost podman[109989]: 2025-11-26 09:06:51.87613279 +0000 UTC m=+0.144537386 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:06:51 localhost podman[109989]: unhealthy Nov 26 04:06:51 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:06:51 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 04:06:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47556 DF PROTO=TCP SPT=37504 DPT=9105 SEQ=1873896068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A77FFC0000000001030307) Nov 26 04:06:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17413 DF PROTO=TCP SPT=60590 DPT=9102 SEQ=1927455267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7877C0000000001030307) Nov 26 04:06:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47294 DF PROTO=TCP SPT=58128 DPT=9100 SEQ=1645773275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A793FD0000000001030307) Nov 26 04:07:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17415 DF PROTO=TCP SPT=60590 DPT=9102 SEQ=1927455267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A79F3C0000000001030307) Nov 26 04:07:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6234 DF PROTO=TCP SPT=43716 DPT=9101 SEQ=3475320121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7ABFC0000000001030307) Nov 26 04:07:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17416 DF PROTO=TCP SPT=60590 DPT=9102 SEQ=1927455267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52A7BFFC0000000001030307) Nov 26 04:07:11 localhost sshd[110026]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:07:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6235 DF PROTO=TCP SPT=43716 DPT=9101 SEQ=3475320121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7CBFC0000000001030307) Nov 26 04:07:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9429 DF PROTO=TCP SPT=50494 DPT=9882 SEQ=1106351039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7CDFC0000000001030307) Nov 26 04:07:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5805 DF PROTO=TCP SPT=38998 DPT=9105 SEQ=1984938096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7D9500000000001030307) Nov 26 04:07:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5807 DF PROTO=TCP SPT=38998 DPT=9105 SEQ=1984938096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7E53C0000000001030307) Nov 26 04:07:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:07:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. Nov 26 04:07:22 localhost systemd[1]: tmp-crun.kEgS8c.mount: Deactivated successfully. 
Nov 26 04:07:22 localhost podman[110028]: 2025-11-26 09:07:22.336379194 +0000 UTC m=+0.093824339 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, name=rhosp17/openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Nov 26 04:07:22 localhost podman[110028]: 2025-11-26 09:07:22.35416955 +0000 UTC m=+0.111614675 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, container_name=ovn_controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 26 04:07:22 localhost systemd[1]: tmp-crun.LV0hs4.mount: Deactivated successfully. Nov 26 04:07:22 localhost podman[110029]: 2025-11-26 09:07:22.381250898 +0000 UTC m=+0.137609368 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 26 04:07:22 localhost podman[110029]: 2025-11-26 09:07:22.395413642 +0000 UTC m=+0.151772072 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 26 04:07:22 localhost podman[110029]: unhealthy Nov 26 04:07:22 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:07:22 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. Nov 26 04:07:22 localhost podman[110028]: unhealthy Nov 26 04:07:22 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:07:22 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 04:07:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5808 DF PROTO=TCP SPT=38998 DPT=9105 SEQ=1984938096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7F4FC0000000001030307) Nov 26 04:07:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36860 DF PROTO=TCP SPT=47360 DPT=9102 SEQ=2529337896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A7FCBD0000000001030307) Nov 26 04:07:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19837 DF PROTO=TCP SPT=59602 DPT=9101 SEQ=1400290725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8093C0000000001030307) Nov 26 04:07:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36862 DF PROTO=TCP SPT=47360 DPT=9102 SEQ=2529337896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8147C0000000001030307) Nov 26 04:07:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19839 DF PROTO=TCP SPT=59602 DPT=9101 SEQ=1400290725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A820FC0000000001030307) Nov 26 04:07:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36863 DF PROTO=TCP SPT=47360 DPT=9102 SEQ=2529337896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52A833FC0000000001030307) Nov 26 04:07:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19840 DF PROTO=TCP SPT=59602 DPT=9101 SEQ=1400290725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A841FC0000000001030307) Nov 26 04:07:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59194 DF PROTO=TCP SPT=41548 DPT=9882 SEQ=4119856019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A843FC0000000001030307) Nov 26 04:07:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35580 DF PROTO=TCP SPT=53152 DPT=9105 SEQ=524248744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A84E810000000001030307) Nov 26 04:07:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35582 DF PROTO=TCP SPT=53152 DPT=9105 SEQ=524248744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A85A7C0000000001030307) Nov 26 04:07:51 localhost sshd[110146]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:07:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:07:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 04:07:52 localhost podman[110149]: 2025-11-26 09:07:52.812721236 +0000 UTC m=+0.071279293 container health_status 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12) Nov 26 04:07:52 localhost systemd[1]: tmp-crun.9ZUUcy.mount: Deactivated successfully. 
Nov 26 04:07:52 localhost podman[110148]: 2025-11-26 09:07:52.863587988 +0000 UTC m=+0.124662743 container health_status 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller) Nov 26 04:07:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35583 DF PROTO=TCP SPT=53152 DPT=9105 SEQ=524248744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A86A3C0000000001030307) Nov 26 04:07:52 localhost podman[110148]: 2025-11-26 09:07:52.877261196 +0000 UTC m=+0.138336011 container exec_died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, tcib_managed=true, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 26 04:07:52 localhost podman[110148]: unhealthy Nov 26 04:07:52 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:07:52 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed with result 'exit-code'. 
Nov 26 04:07:52 localhost podman[110149]: 2025-11-26 09:07:52.898472891 +0000 UTC m=+0.157030948 container exec_died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, distribution-scope=public) Nov 26 04:07:52 localhost podman[110149]: unhealthy Nov 26 04:07:52 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:07:52 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed with result 'exit-code'. 
Nov 26 04:07:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47600 DF PROTO=TCP SPT=40894 DPT=9102 SEQ=722939789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A871BD0000000001030307) Nov 26 04:07:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17418 DF PROTO=TCP SPT=60590 DPT=9102 SEQ=1927455267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A87DFC0000000001030307) Nov 26 04:08:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47602 DF PROTO=TCP SPT=40894 DPT=9102 SEQ=722939789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8897C0000000001030307) Nov 26 04:08:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35356 DF PROTO=TCP SPT=55004 DPT=9101 SEQ=3519561454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8963C0000000001030307) Nov 26 04:08:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47603 DF PROTO=TCP SPT=40894 DPT=9102 SEQ=722939789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8A9FC0000000001030307) Nov 26 04:08:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35357 DF PROTO=TCP SPT=55004 DPT=9101 SEQ=3519561454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52A8B5FD0000000001030307) Nov 26 04:08:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38194 DF PROTO=TCP SPT=42138 DPT=9882 SEQ=2542137351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8B7FD0000000001030307) Nov 26 04:08:13 localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing. Nov 26 04:08:13 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 61600 (conmon) with signal SIGKILL. Nov 26 04:08:13 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL Nov 26 04:08:13 localhost systemd[1]: libpod-conmon-b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365.scope: Deactivated successfully. Nov 26 04:08:13 localhost podman[110198]: error opening file `/run/crun/b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365/status`: No such file or directory Nov 26 04:08:13 localhost podman[110186]: 2025-11-26 09:08:13.29694957 +0000 UTC m=+0.063026244 container cleanup b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, 
name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container) Nov 26 04:08:13 localhost podman[110186]: nova_virtqemud Nov 26 04:08:13 localhost systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'. Nov 26 04:08:13 localhost systemd[1]: Stopped nova_virtqemud container. Nov 26 04:08:14 localhost python3.9[110292]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:08:14 localhost systemd[1]: Reloading. Nov 26 04:08:14 localhost systemd-rc-local-generator[110318]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:08:14 localhost systemd-sysv-generator[110321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:08:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:08:15 localhost python3.9[110422]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:08:15 localhost systemd[1]: Reloading. 
Nov 26 04:08:15 localhost systemd-sysv-generator[110450]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:08:15 localhost systemd-rc-local-generator[110446]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:08:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:08:15 localhost systemd[1]: Stopping nova_virtsecretd container... Nov 26 04:08:15 localhost systemd[1]: libpod-906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b.scope: Deactivated successfully. Nov 26 04:08:15 localhost podman[110463]: 2025-11-26 09:08:15.677410918 +0000 UTC m=+0.076062302 container died 906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtsecretd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, 
url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 04:08:15 localhost podman[110463]: 2025-11-26 09:08:15.714596701 +0000 UTC m=+0.113248055 container cleanup 906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtsecretd, release=1761123044, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 
'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4) Nov 26 04:08:15 localhost podman[110463]: nova_virtsecretd Nov 26 04:08:15 localhost podman[110476]: 2025-11-26 09:08:15.756350159 +0000 UTC m=+0.065100849 container cleanup 906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, container_name=nova_virtsecretd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container) Nov 26 04:08:15 localhost systemd[1]: libpod-conmon-906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b.scope: Deactivated successfully. 
Nov 26 04:08:15 localhost podman[110503]: error opening file `/run/crun/906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b/status`: No such file or directory Nov 26 04:08:15 localhost podman[110492]: 2025-11-26 09:08:15.850914229 +0000 UTC m=+0.061618120 container cleanup 906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtsecretd, io.openshift.expose-services=) Nov 26 04:08:15 localhost podman[110492]: nova_virtsecretd Nov 26 04:08:15 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Nov 26 04:08:15 localhost systemd[1]: Stopped nova_virtsecretd container. 
Nov 26 04:08:16 localhost python3.9[110596]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:08:16 localhost systemd[1]: Reloading. Nov 26 04:08:16 localhost systemd-rc-local-generator[110619]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:08:16 localhost systemd-sysv-generator[110622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:08:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2147 DF PROTO=TCP SPT=35528 DPT=9105 SEQ=2172896945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8C7BD0000000001030307) Nov 26 04:08:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:08:16 localhost systemd[1]: tmp-crun.umW3PR.mount: Deactivated successfully. Nov 26 04:08:16 localhost systemd[1]: var-lib-containers-storage-overlay-936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552-merged.mount: Deactivated successfully. Nov 26 04:08:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-906839e5d93b9347df842476746b0d3a39742bde0368f5b18aed5994f7acb07b-userdata-shm.mount: Deactivated successfully. Nov 26 04:08:16 localhost systemd[1]: Stopping nova_virtstoraged container... Nov 26 04:08:17 localhost systemd[1]: tmp-crun.750Tdq.mount: Deactivated successfully. 
Nov 26 04:08:17 localhost systemd[1]: libpod-73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb.scope: Deactivated successfully. Nov 26 04:08:17 localhost podman[110637]: 2025-11-26 09:08:17.071345383 +0000 UTC m=+0.088841171 container died 73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, build-date=2025-11-19T00:35:22Z, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 
'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}) Nov 26 04:08:17 localhost podman[110637]: 2025-11-26 09:08:17.103103018 +0000 UTC m=+0.120598746 container cleanup 73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, 
url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtstoraged, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 26 04:08:17 localhost podman[110637]: nova_virtstoraged Nov 26 04:08:17 localhost podman[110651]: 2025-11-26 09:08:17.156074646 +0000 UTC m=+0.073220324 container cleanup 73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=nova_virtstoraged, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 26 04:08:17 localhost systemd[1]: libpod-conmon-73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb.scope: Deactivated successfully. Nov 26 04:08:17 localhost podman[110679]: error opening file `/run/crun/73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb/status`: No such file or directory Nov 26 04:08:17 localhost podman[110666]: 2025-11-26 09:08:17.255373804 +0000 UTC m=+0.065909254 container cleanup 73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c7803ed1795969cb7cf47e6d4d57c4b9'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 
'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z) Nov 26 04:08:17 localhost podman[110666]: nova_virtstoraged Nov 26 04:08:17 localhost systemd[1]: 
tripleo_nova_virtstoraged.service: Deactivated successfully. Nov 26 04:08:17 localhost systemd[1]: Stopped nova_virtstoraged container. Nov 26 04:08:17 localhost systemd[1]: var-lib-containers-storage-overlay-eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9-merged.mount: Deactivated successfully. Nov 26 04:08:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb-userdata-shm.mount: Deactivated successfully. Nov 26 04:08:18 localhost python3.9[110772]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:08:18 localhost systemd[1]: Reloading. Nov 26 04:08:18 localhost systemd-sysv-generator[110802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:08:18 localhost systemd-rc-local-generator[110799]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:08:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:08:18 localhost systemd[1]: Stopping ovn_controller container... Nov 26 04:08:18 localhost systemd[1]: libpod-4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.scope: Deactivated successfully. Nov 26 04:08:18 localhost systemd[1]: libpod-4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.scope: Consumed 2.680s CPU time. 
Nov 26 04:08:18 localhost podman[110813]: 2025-11-26 09:08:18.469360896 +0000 UTC m=+0.098036299 container died 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ovn_controller, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, release=1761123044, 
architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 04:08:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.timer: Deactivated successfully. Nov 26 04:08:18 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5. Nov 26 04:08:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed to open /run/systemd/transient/4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: No such file or directory Nov 26 04:08:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5-userdata-shm.mount: Deactivated successfully. Nov 26 04:08:18 localhost systemd[1]: var-lib-containers-storage-overlay-4bfd9627f34b55559ce4f57af46bbdd5ae4252c5961ae723ed839f5c0358e239-merged.mount: Deactivated successfully. 
Nov 26 04:08:18 localhost podman[110813]: 2025-11-26 09:08:18.520785817 +0000 UTC m=+0.149461130 container cleanup 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12) Nov 26 04:08:18 localhost podman[110813]: ovn_controller Nov 26 04:08:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.timer: Failed to open /run/systemd/transient/4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.timer: No such file or directory Nov 26 04:08:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed to open /run/systemd/transient/4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: No such file or directory Nov 26 04:08:18 localhost podman[110828]: 2025-11-26 09:08:18.565127895 +0000 UTC m=+0.083755653 container cleanup 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 04:08:18 localhost systemd[1]: libpod-conmon-4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.scope: Deactivated successfully. 
Nov 26 04:08:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.timer: Failed to open /run/systemd/transient/4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.timer: No such file or directory Nov 26 04:08:18 localhost systemd[1]: 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: Failed to open /run/systemd/transient/4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5.service: No such file or directory Nov 26 04:08:18 localhost podman[110842]: 2025-11-26 09:08:18.672835046 +0000 UTC m=+0.064418947 container cleanup 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, 
url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 26 04:08:18 localhost podman[110842]: ovn_controller Nov 26 04:08:18 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully. Nov 26 04:08:18 localhost systemd[1]: Stopped ovn_controller container. Nov 26 04:08:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2148 DF PROTO=TCP SPT=35528 DPT=9105 SEQ=2172896945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8CFBC0000000001030307) Nov 26 04:08:19 localhost python3.9[110946]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:08:19 localhost systemd[1]: Reloading. Nov 26 04:08:19 localhost systemd-rc-local-generator[110969]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:08:19 localhost systemd-sysv-generator[110975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:08:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:08:19 localhost systemd[1]: Stopping ovn_metadata_agent container... Nov 26 04:08:20 localhost systemd[1]: libpod-670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.scope: Deactivated successfully. Nov 26 04:08:20 localhost systemd[1]: libpod-670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.scope: Consumed 11.287s CPU time. Nov 26 04:08:20 localhost podman[110987]: 2025-11-26 09:08:20.543189926 +0000 UTC m=+0.655122149 container died 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com) Nov 26 04:08:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.timer: Deactivated successfully. Nov 26 04:08:20 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942. 
Nov 26 04:08:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed to open /run/systemd/transient/670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: No such file or directory Nov 26 04:08:20 localhost podman[110987]: 2025-11-26 09:08:20.608563263 +0000 UTC m=+0.720495466 container cleanup 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true) Nov 26 04:08:20 localhost podman[110987]: ovn_metadata_agent Nov 26 04:08:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.timer: Failed to open /run/systemd/transient/670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.timer: No such file or directory Nov 26 04:08:20 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed to open /run/systemd/transient/670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: No such file or directory Nov 26 04:08:20 localhost podman[111000]: 2025-11-26 09:08:20.647675957 +0000 UTC m=+0.086995874 container cleanup 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 26 04:08:20 localhost systemd[1]: var-lib-containers-storage-overlay-7d5b4c6945eb62c6a5f0770a2d5f587e2cc4a0e0e03face3bef429dfc2877809-merged.mount: Deactivated successfully. Nov 26 04:08:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:08:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2149 DF PROTO=TCP SPT=35528 DPT=9105 SEQ=2172896945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8DF7C0000000001030307) Nov 26 04:08:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63888 DF PROTO=TCP SPT=46582 DPT=9102 SEQ=2283799202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8E6FD0000000001030307) Nov 26 04:08:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65494 DF PROTO=TCP SPT=48088 DPT=9101 SEQ=2635169813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8F3BC0000000001030307) Nov 26 04:08:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63890 DF PROTO=TCP SPT=46582 DPT=9102 SEQ=2283799202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A8FEBC0000000001030307) Nov 26 04:08:31 localhost sshd[111016]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:08:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65496 DF PROTO=TCP SPT=48088 DPT=9101 SEQ=2635169813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A90B7C0000000001030307) Nov 26 04:08:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23727 DF PROTO=TCP SPT=38976 DPT=9100 
SEQ=3092091194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A91FFD0000000001030307) Nov 26 04:08:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65497 DF PROTO=TCP SPT=48088 DPT=9101 SEQ=2635169813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A92BFC0000000001030307) Nov 26 04:08:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57647 DF PROTO=TCP SPT=39178 DPT=9882 SEQ=2901900358 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A92DFC0000000001030307) Nov 26 04:08:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5727 DF PROTO=TCP SPT=57148 DPT=9105 SEQ=1376035902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A938E00000000001030307) Nov 26 04:08:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5729 DF PROTO=TCP SPT=57148 DPT=9105 SEQ=1376035902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A944FC0000000001030307) Nov 26 04:08:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5730 DF PROTO=TCP SPT=57148 DPT=9105 SEQ=1376035902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A954BC0000000001030307) Nov 26 04:08:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62542 DF PROTO=TCP 
SPT=45568 DPT=9102 SEQ=1176415353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A95C3D0000000001030307) Nov 26 04:08:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47605 DF PROTO=TCP SPT=40894 DPT=9102 SEQ=722939789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A967FC0000000001030307) Nov 26 04:09:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35359 DF PROTO=TCP SPT=55004 DPT=9101 SEQ=3519561454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A973FC0000000001030307) Nov 26 04:09:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53716 DF PROTO=TCP SPT=34306 DPT=9101 SEQ=1118577009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A980BC0000000001030307) Nov 26 04:09:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62545 DF PROTO=TCP SPT=45568 DPT=9102 SEQ=1176415353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A993FC0000000001030307) Nov 26 04:09:11 localhost sshd[111145]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:09:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53717 DF PROTO=TCP SPT=34306 DPT=9101 SEQ=1118577009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9A1FD0000000001030307) Nov 26 04:09:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10868 DF PROTO=TCP SPT=35972 DPT=9105 SEQ=1587795430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9AE100000000001030307) Nov 26 04:09:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10869 DF PROTO=TCP SPT=35972 DPT=9105 SEQ=1587795430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9B1FC0000000001030307) Nov 26 04:09:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10870 DF PROTO=TCP SPT=35972 DPT=9105 SEQ=1587795430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9B9FD0000000001030307) Nov 26 04:09:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10871 DF PROTO=TCP SPT=35972 DPT=9105 SEQ=1587795430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9C9BC0000000001030307) Nov 26 04:09:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37742 DF PROTO=TCP SPT=44418 DPT=9102 SEQ=4075543129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9D17D0000000001030307) Nov 26 04:09:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4271 DF PROTO=TCP SPT=35206 DPT=9101 SEQ=1068231806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9DDFC0000000001030307) Nov 26 04:09:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 
MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37744 DF PROTO=TCP SPT=44418 DPT=9102 SEQ=4075543129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9E93C0000000001030307) Nov 26 04:09:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4273 DF PROTO=TCP SPT=35206 DPT=9101 SEQ=1068231806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52A9F5BD0000000001030307) Nov 26 04:09:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37745 DF PROTO=TCP SPT=44418 DPT=9102 SEQ=4075543129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA09FC0000000001030307) Nov 26 04:09:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4274 DF PROTO=TCP SPT=35206 DPT=9101 SEQ=1068231806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA15FC0000000001030307) Nov 26 04:09:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39233 DF PROTO=TCP SPT=49454 DPT=9882 SEQ=2559575129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA17FC0000000001030307) Nov 26 04:09:44 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing. Nov 26 04:09:44 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 69085 (conmon) with signal SIGKILL. 
Nov 26 04:09:44 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL Nov 26 04:09:44 localhost systemd[1]: libpod-conmon-670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.scope: Deactivated successfully. Nov 26 04:09:44 localhost systemd[1]: tmp-crun.1HiABF.mount: Deactivated successfully. Nov 26 04:09:44 localhost podman[111221]: error opening file `/run/crun/670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942/status`: No such file or directory Nov 26 04:09:44 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.timer: Failed to open /run/systemd/transient/670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.timer: No such file or directory Nov 26 04:09:44 localhost systemd[1]: 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: Failed to open /run/systemd/transient/670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942.service: No such file or directory Nov 26 04:09:44 localhost podman[111209]: 2025-11-26 09:09:44.851791076 +0000 UTC m=+0.109964352 container cleanup 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ovn_metadata_agent, distribution-scope=public) Nov 26 04:09:44 localhost podman[111209]: ovn_metadata_agent Nov 26 04:09:44 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'. Nov 26 04:09:44 localhost systemd[1]: Stopped ovn_metadata_agent container. Nov 26 04:09:45 localhost python3.9[111330]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:09:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63185 DF PROTO=TCP SPT=58536 DPT=9105 SEQ=2702659253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA23400000000001030307) Nov 26 04:09:46 localhost systemd[1]: Reloading. Nov 26 04:09:46 localhost systemd-rc-local-generator[111359]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:09:46 localhost systemd-sysv-generator[111363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:09:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:09:48 localhost python3.9[111460]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63187 DF PROTO=TCP SPT=58536 DPT=9105 SEQ=2702659253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA2F3C0000000001030307) Nov 26 04:09:49 localhost python3.9[111552]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:49 localhost python3.9[111644]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:50 localhost python3.9[111736]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:50 localhost sshd[111783]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:09:50 localhost python3.9[111830]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:51 localhost python3.9[111922]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:52 localhost python3.9[112014]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:52 localhost python3.9[112106]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63188 DF PROTO=TCP SPT=58536 DPT=9105 SEQ=2702659253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA3EFD0000000001030307) Nov 26 04:09:53 localhost python3.9[112198]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:53 localhost python3.9[112290]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:54 localhost python3.9[112382]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Nov 26 04:09:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18830 DF PROTO=TCP SPT=34268 DPT=9102 SEQ=1348172210 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA467D0000000001030307) Nov 26 04:09:55 localhost python3.9[112474]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:55 localhost python3.9[112566]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:56 localhost python3.9[112658]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:56 localhost python3.9[112750]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:57 localhost python3.9[112842]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42424 DF PROTO=TCP SPT=45080 DPT=9101 SEQ=1825497630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA533C0000000001030307) Nov 26 04:09:58 localhost python3.9[112934]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:58 localhost python3.9[113026]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:59 localhost python3.9[113119]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:09:59 localhost python3.9[113211]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:00 localhost python3.9[113303]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18832 DF PROTO=TCP SPT=34268 DPT=9102 SEQ=1348172210 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA5E3C0000000001030307) Nov 26 04:10:01 localhost python3.9[113395]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:02 localhost python3.9[113487]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:02 localhost python3.9[113579]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:03 localhost python3.9[113671]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=42426 DF PROTO=TCP SPT=45080 DPT=9101 SEQ=1825497630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA6AFC0000000001030307) Nov 26 04:10:04 localhost python3.9[113763]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:04 localhost python3.9[113855]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:05 localhost python3.9[113947]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:05 localhost python3.9[114039]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Nov 26 04:10:06 localhost python3.9[114131]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:07 localhost python3.9[114223]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:07 localhost python3.9[114315]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:08 localhost python3.9[114407]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:08 localhost python3.9[114499]: 
ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18833 DF PROTO=TCP SPT=34268 DPT=9102 SEQ=1348172210 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA7DFC0000000001030307) Nov 26 04:10:09 localhost python3.9[114591]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:10 localhost python3.9[114683]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:10 localhost python3.9[114775]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:11 localhost python3.9[114867]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:12 localhost python3.9[114959]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42427 DF PROTO=TCP SPT=45080 DPT=9101 SEQ=1825497630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA8BFC0000000001030307) Nov 26 04:10:12 localhost python3.9[115051]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:13 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10381 DF PROTO=TCP SPT=36002 DPT=9882 SEQ=3340061116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA8DFC0000000001030307) Nov 26 04:10:13 localhost python3.9[115143]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:14 localhost python3.9[115235]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:10:15 localhost python3.9[115327]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:16 localhost python3.9[115419]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False 
exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 26 04:10:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63786 DF PROTO=TCP SPT=36912 DPT=9105 SEQ=851095871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AA9C7D0000000001030307) Nov 26 04:10:17 localhost python3.9[115511]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:10:17 localhost systemd[1]: Reloading. Nov 26 04:10:17 localhost systemd-rc-local-generator[115536]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:10:17 localhost systemd-sysv-generator[115542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:10:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:10:18 localhost python3.9[115639]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63787 DF PROTO=TCP SPT=36912 DPT=9105 SEQ=851095871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AAA47D0000000001030307) Nov 26 04:10:19 localhost python3.9[115732]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:20 localhost python3.9[115825]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:21 localhost python3.9[115918]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:21 localhost python3.9[116011]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None 
removes=None stdin=None Nov 26 04:10:22 localhost python3.9[116104]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63788 DF PROTO=TCP SPT=36912 DPT=9105 SEQ=851095871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AAB43D0000000001030307) Nov 26 04:10:23 localhost python3.9[116197]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:23 localhost python3.9[116290]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:24 localhost python3.9[116383]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12213 DF PROTO=TCP SPT=49502 DPT=9102 SEQ=2562890933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52AABBBC0000000001030307) Nov 26 04:10:24 localhost python3.9[116476]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:25 localhost python3.9[116569]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:26 localhost python3.9[116662]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:26 localhost python3.9[116755]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:27 localhost python3.9[116848]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:27 localhost python3.9[116941]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True 
_raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50696 DF PROTO=TCP SPT=52144 DPT=9100 SEQ=4281172620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AAC7FC0000000001030307) Nov 26 04:10:28 localhost python3.9[117034]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:29 localhost python3.9[117127]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:29 localhost python3.9[117220]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:30 localhost python3.9[117313]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:30 localhost sshd[117380]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:10:30 localhost python3.9[117407]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed 
tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12215 DF PROTO=TCP SPT=49502 DPT=9102 SEQ=2562890933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AAD37C0000000001030307) Nov 26 04:10:31 localhost python3.9[117501]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:10:33 localhost systemd[1]: session-38.scope: Deactivated successfully. Nov 26 04:10:33 localhost systemd[1]: session-38.scope: Consumed 49.087s CPU time. Nov 26 04:10:33 localhost systemd-logind[761]: Session 38 logged out. Waiting for processes to exit. Nov 26 04:10:33 localhost systemd-logind[761]: Removed session 38. 
Nov 26 04:10:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37108 DF PROTO=TCP SPT=50426 DPT=9101 SEQ=646132754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AAE03C0000000001030307) Nov 26 04:10:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12216 DF PROTO=TCP SPT=49502 DPT=9102 SEQ=2562890933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AAF3FC0000000001030307) Nov 26 04:10:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37109 DF PROTO=TCP SPT=50426 DPT=9101 SEQ=646132754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AAFFFC0000000001030307) Nov 26 04:10:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18871 DF PROTO=TCP SPT=46420 DPT=9882 SEQ=4195222755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB01FC0000000001030307) Nov 26 04:10:46 localhost podman[117616]: 2025-11-26 09:10:46.165666931 +0000 UTC m=+0.087625294 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , 
GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, ceph=True, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:10:46 localhost podman[117616]: 2025-11-26 09:10:46.268525011 +0000 UTC m=+0.190483394 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_BRANCH=main) Nov 26 
04:10:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65056 DF PROTO=TCP SPT=55846 DPT=9105 SEQ=124421775 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB11BD0000000001030307) Nov 26 04:10:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65057 DF PROTO=TCP SPT=55846 DPT=9105 SEQ=124421775 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB19BD0000000001030307) Nov 26 04:10:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65058 DF PROTO=TCP SPT=55846 DPT=9105 SEQ=124421775 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB297C0000000001030307) Nov 26 04:10:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2541 DF PROTO=TCP SPT=55526 DPT=9102 SEQ=2410125927 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB30FD0000000001030307) Nov 26 04:10:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39356 DF PROTO=TCP SPT=54174 DPT=9101 SEQ=1790165610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB3DBD0000000001030307) Nov 26 04:10:58 localhost sshd[117759]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:10:58 localhost systemd-logind[761]: New session 39 of user zuul. Nov 26 04:10:58 localhost systemd[1]: Started Session 39 of User zuul. 
Nov 26 04:10:59 localhost python3.9[117852]: ansible-ansible.legacy.ping Invoked with data=pong Nov 26 04:11:00 localhost python3.9[117956]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:11:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2543 DF PROTO=TCP SPT=55526 DPT=9102 SEQ=2410125927 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB48BC0000000001030307) Nov 26 04:11:01 localhost python3.9[118048]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:11:02 localhost python3.9[118141]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:11:03 localhost python3.9[118233]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:11:04 localhost python3.9[118325]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:11:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 
MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39358 DF PROTO=TCP SPT=54174 DPT=9101 SEQ=1790165610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB557C0000000001030307) Nov 26 04:11:04 localhost python3.9[118398]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148263.5742145-178-261110787838578/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:11:05 localhost python3.9[118490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:11:06 localhost python3.9[118586]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:11:07 localhost python3.9[118678]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:11:07 localhost python3.9[118768]: ansible-ansible.builtin.service_facts Invoked 
Nov 26 04:11:07 localhost network[118785]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 04:11:07 localhost network[118786]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:11:07 localhost network[118787]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 04:11:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:11:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34496 DF PROTO=TCP SPT=40584 DPT=9100 SEQ=701969763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB69FC0000000001030307) Nov 26 04:11:09 localhost sshd[118851]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:11:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39359 DF PROTO=TCP SPT=54174 DPT=9101 SEQ=1790165610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB75FC0000000001030307) Nov 26 04:11:12 localhost python3.9[118986]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:11:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31362 DF PROTO=TCP SPT=47914 DPT=9882 SEQ=1723132743 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A52AB77FC0000000001030307) Nov 26 04:11:13 localhost python3.9[119076]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:11:14 localhost python3.9[119172]: ansible-ansible.legacy.command Invoked with _raw_params= (journald's #012 newline escapes decoded below for readability)
# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
set -euxo pipefail
curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
python3 -m venv ./venv
PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
# This is required for FIPS enabled until trunk.rdoproject.org
# is not being served from a centos7 host, tracked by
# https://issues.redhat.com/browse/RHOSZUUL-1517
dnf -y install crypto-policies
update-crypto-policies --set FIPS:NO-ENFORCE-EMS
./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream

# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
# with rhel 9.2 openssh
dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
# FIXME: perform dnf upgrade for other packages in EDPM ansible
# here we only ensuring that decontainerized libvirt can start
dnf -y upgrade openstack-selinux
rm -f /run/virtlogd.pid

rm -rf repo-setup-main
 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:11:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29975 DF PROTO=TCP SPT=36970 DPT=9105 SEQ=1579365823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB82D00000000001030307) Nov 26 04:11:18 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29977 DF PROTO=TCP SPT=36970 DPT=9105 SEQ=1579365823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB8EBC0000000001030307) Nov 26 04:11:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29978 DF PROTO=TCP SPT=36970 DPT=9105 SEQ=1579365823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AB9E7D0000000001030307) Nov 26 04:11:24 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 26 04:11:24 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 26 04:11:24 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 26 04:11:24 localhost systemd[1]: sshd.service: Consumed 4.211s CPU time. Nov 26 04:11:24 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 26 04:11:24 localhost systemd[1]: Stopping sshd-keygen.target... Nov 26 04:11:24 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 26 04:11:24 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 26 04:11:24 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 26 04:11:24 localhost systemd[1]: Reached target sshd-keygen.target. Nov 26 04:11:24 localhost systemd[1]: Starting OpenSSH server daemon... Nov 26 04:11:24 localhost sshd[119215]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:11:24 localhost systemd[1]: Started OpenSSH server daemon. 
Nov 26 04:11:24 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 26 04:11:24 localhost systemd[1]: Starting man-db-cache-update.service... Nov 26 04:11:24 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 26 04:11:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22115 DF PROTO=TCP SPT=49568 DPT=9102 SEQ=3885511370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ABA63C0000000001030307) Nov 26 04:11:24 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 26 04:11:24 localhost systemd[1]: Finished man-db-cache-update.service. Nov 26 04:11:24 localhost systemd[1]: run-r52ff583073864d92929de6d169620e5d.service: Deactivated successfully. Nov 26 04:11:24 localhost systemd[1]: run-r95166c5f28134ffb8e5688f98e1f4dc4.service: Deactivated successfully. Nov 26 04:11:25 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 26 04:11:25 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 26 04:11:25 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 26 04:11:25 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 26 04:11:25 localhost systemd[1]: Stopping sshd-keygen.target... Nov 26 04:11:25 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 26 04:11:25 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). 
Nov 26 04:11:25 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 26 04:11:25 localhost systemd[1]: Reached target sshd-keygen.target. Nov 26 04:11:25 localhost systemd[1]: Starting OpenSSH server daemon... Nov 26 04:11:25 localhost sshd[119386]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:11:25 localhost systemd[1]: Started OpenSSH server daemon. Nov 26 04:11:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12218 DF PROTO=TCP SPT=49502 DPT=9102 SEQ=2562890933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ABB1FD0000000001030307) Nov 26 04:11:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29979 DF PROTO=TCP SPT=36970 DPT=9105 SEQ=1579365823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ABBDFC0000000001030307) Nov 26 04:11:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34670 DF PROTO=TCP SPT=50562 DPT=9101 SEQ=1568777707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ABCA7C0000000001030307) Nov 26 04:11:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22118 DF PROTO=TCP SPT=49568 DPT=9102 SEQ=3885511370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ABDDFC0000000001030307) Nov 26 04:11:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34671 DF PROTO=TCP SPT=50562 DPT=9101 SEQ=1568777707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ABE9FC0000000001030307) Nov 26 04:11:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62642 DF PROTO=TCP SPT=48768 DPT=9105 SEQ=1064641805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ABF7FF0000000001030307) Nov 26 04:11:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62643 DF PROTO=TCP SPT=48768 DPT=9105 SEQ=1064641805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ABFBFD0000000001030307) Nov 26 04:11:48 localhost sshd[119540]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:11:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62644 DF PROTO=TCP SPT=48768 DPT=9105 SEQ=1064641805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC03FD0000000001030307) Nov 26 04:11:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62645 DF PROTO=TCP SPT=48768 DPT=9105 SEQ=1064641805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC13BC0000000001030307) Nov 26 04:11:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7580 DF PROTO=TCP SPT=58636 DPT=9102 SEQ=1343168400 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC1B3D0000000001030307) Nov 26 04:11:58 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34498 DF PROTO=TCP SPT=40584 DPT=9100 SEQ=701969763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC27FC0000000001030307) Nov 26 04:12:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7582 DF PROTO=TCP SPT=58636 DPT=9102 SEQ=1343168400 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC32FD0000000001030307) Nov 26 04:12:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56247 DF PROTO=TCP SPT=35486 DPT=9101 SEQ=570447650 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC3FBC0000000001030307) Nov 26 04:12:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42021 DF PROTO=TCP SPT=43690 DPT=9100 SEQ=549055891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC53FC0000000001030307) Nov 26 04:12:10 localhost sshd[119748]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:12:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56248 DF PROTO=TCP SPT=35486 DPT=9101 SEQ=570447650 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC5FFC0000000001030307) Nov 26 04:12:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24715 DF PROTO=TCP SPT=47928 DPT=9882 SEQ=3201611040 ACK=0 WINDOW=32640 RES=0x00 SYN 
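The recurring kernel `DROPPING:` entries above all share one `KEY=value` layout, the format a netfilter LOG rule emits (the `DROPPING: ` prefix and the `br-ex` interface are taken from the log itself; the firewall rule producing them is not shown in this capture). A minimal Python sketch to pull the fields out of one such entry:

```python
# Parse a netfilter-style kernel log entry into a dict of fields.
# The sample is abbreviated from a DROPPING entry in the log above.
sample = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 "
          "MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 "
          "DST=192.168.122.107 LEN=60 TTL=62 ID=34498 DF PROTO=TCP "
          "SPT=40584 DPT=9100 SEQ=701969763 ACK=0 WINDOW=32640 SYN URGP=0")

def parse_drop(entry: str) -> dict:
    """Split a netfilter log entry into KEY=value fields.

    Bare flags such as DF and SYN carry no '='; record them as True.
    """
    fields = {}
    for token in entry.split():
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value          # may be empty, e.g. OUT=
        else:
            fields[token] = True         # bare flag (DF, SYN, ...)
    return fields

f = parse_drop(sample)
print(f["SRC"], f["DST"], f["DPT"], f.get("SYN"))
# → 192.168.122.10 192.168.122.107 9100 True
```

Grouping entries by `(SPT, SEQ)` makes the pattern in this log visible: the same SYN (e.g. `SPT=36970 SEQ=1579365823` at 04:11:15, 04:11:18, 04:11:22, 04:11:30) is retransmitted with the roughly doubling backoff of an unanswered TCP connection attempt, while the IP `ID` increments.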
URGP=0 OPT (020405500402080A52AC61FC0000000001030307) Nov 26 04:12:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55756 DF PROTO=TCP SPT=36128 DPT=9105 SEQ=3189564336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC713C0000000001030307) Nov 26 04:12:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55757 DF PROTO=TCP SPT=36128 DPT=9105 SEQ=3189564336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC793D0000000001030307) Nov 26 04:12:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55758 DF PROTO=TCP SPT=36128 DPT=9105 SEQ=3189564336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC88FC0000000001030307) Nov 26 04:12:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11528 DF PROTO=TCP SPT=56946 DPT=9102 SEQ=466993611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC907D0000000001030307) Nov 26 04:12:27 localhost sshd[119805]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:12:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7221 DF PROTO=TCP SPT=48794 DPT=9101 SEQ=1514976879 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AC9D3C0000000001030307) Nov 26 04:12:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=34673 DF PROTO=TCP SPT=50562 DPT=9101 SEQ=1568777707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ACA7FC0000000001030307) Nov 26 04:12:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7223 DF PROTO=TCP SPT=48794 DPT=9101 SEQ=1514976879 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ACB4FD0000000001030307) Nov 26 04:12:37 localhost kernel: SELinux: Converting 2752 SID table entries... Nov 26 04:12:37 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 04:12:37 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 04:12:37 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 04:12:37 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 04:12:37 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 04:12:37 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 04:12:37 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 04:12:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11531 DF PROTO=TCP SPT=56946 DPT=9102 SEQ=466993611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ACC7FC0000000001030307) Nov 26 04:12:39 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=17 res=1 Nov 26 04:12:39 localhost python3.9[120033]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Nov 26 04:12:40 localhost python3.9[120125]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:12:41 localhost python3.9[120198]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148359.9650087-427-44727755055829/.source.fact _original_basename=.gxpy8hst follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:12:41 localhost python3.9[120288]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:12:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7224 DF PROTO=TCP SPT=48794 DPT=9101 SEQ=1514976879 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ACD5FD0000000001030307) Nov 26 04:12:43 localhost python3.9[120386]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:12:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57874 DF PROTO=TCP SPT=46210 DPT=9882 SEQ=1406944603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ACD7FC0000000001030307) Nov 26 04:12:43 localhost python3.9[120440]: ansible-ansible.legacy.dnf Invoked with 
name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:12:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8286 DF PROTO=TCP SPT=56274 DPT=9105 SEQ=2282105981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ACE2600000000001030307) Nov 26 04:12:47 localhost systemd[1]: Reloading. Nov 26 04:12:47 localhost systemd-rc-local-generator[120476]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:12:47 localhost systemd-sysv-generator[120479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:12:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:12:47 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 26 04:12:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8288 DF PROTO=TCP SPT=56274 DPT=9105 SEQ=2282105981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ACEE7C0000000001030307) Nov 26 04:12:49 localhost python3.9[120580]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:12:51 localhost python3.9[120894]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False Nov 26 04:12:52 localhost python3.9[120987]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None Nov 26 04:12:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8289 DF PROTO=TCP SPT=56274 DPT=9105 SEQ=2282105981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ACFE3C0000000001030307) Nov 26 04:12:53 localhost python3.9[121080]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:12:54 localhost python3.9[121172]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None Nov 26 04:12:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50137 DF PROTO=TCP SPT=34860 DPT=9102 SEQ=210393682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD05BC0000000001030307) Nov 26 04:12:55 localhost python3.9[121264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:12:56 localhost python3.9[121356]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:12:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50138 DF PROTO=TCP SPT=34860 DPT=9102 SEQ=210393682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD0DBD0000000001030307) Nov 26 04:12:58 localhost python3.9[121429]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1764148375.866164-751-39333116580790/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:12:59 localhost python3.9[121521]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:13:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50139 DF PROTO=TCP SPT=34860 DPT=9102 SEQ=210393682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD1D7C0000000001030307) Nov 26 04:13:01 localhost python3.9[121615]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None Nov 26 04:13:01 localhost python3.9[121708]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None Nov 26 04:13:02 localhost python3.9[121801]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 26 04:13:03 localhost python3.9[121899]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None Nov 26 04:13:04 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42581 DF PROTO=TCP SPT=45144 DPT=9101 SEQ=1080875669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD2A3D0000000001030307) Nov 26 04:13:04 localhost python3.9[121991]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:13:06 localhost sshd[121994]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:13:08 localhost python3.9[122087]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:13:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50140 DF PROTO=TCP SPT=34860 DPT=9102 SEQ=210393682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD3DFC0000000001030307) Nov 26 04:13:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42582 DF PROTO=TCP SPT=45144 DPT=9101 
SEQ=1080875669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD49FD0000000001030307) Nov 26 04:13:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54052 DF PROTO=TCP SPT=58420 DPT=9882 SEQ=985407002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD4BFD0000000001030307) Nov 26 04:13:12 localhost python3.9[122179]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:13:13 localhost python3.9[122252]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148392.3539155-1024-237597128427267/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 04:13:15 localhost python3.9[122345]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:13:15 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 26 04:13:15 localhost systemd[1]: Stopped Load Kernel Modules. Nov 26 04:13:15 localhost systemd[1]: Stopping Load Kernel Modules... Nov 26 04:13:15 localhost systemd[1]: Starting Load Kernel Modules... Nov 26 04:13:15 localhost systemd-modules-load[122349]: Module 'msr' is built in Nov 26 04:13:15 localhost systemd[1]: Finished Load Kernel Modules. 
Nov 26 04:13:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58933 DF PROTO=TCP SPT=60130 DPT=9105 SEQ=4175016231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD57900000000001030307) Nov 26 04:13:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58935 DF PROTO=TCP SPT=60130 DPT=9105 SEQ=4175016231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD637C0000000001030307) Nov 26 04:13:20 localhost python3.9[122441]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:13:21 localhost python3.9[122514]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148398.582522-1094-19888567334696/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 04:13:22 localhost python3.9[122606]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 
use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:13:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58936 DF PROTO=TCP SPT=60130 DPT=9105 SEQ=4175016231 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD733C0000000001030307) Nov 26 04:13:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31395 DF PROTO=TCP SPT=49570 DPT=9102 SEQ=1503954600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD7AFC0000000001030307) Nov 26 04:13:26 localhost python3.9[122698]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:13:26 localhost python3.9[122790]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Nov 26 04:13:27 localhost python3.9[122880]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:13:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59162 DF PROTO=TCP SPT=36050 DPT=9101 SEQ=1034858938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD877C0000000001030307) Nov 26 04:13:28 localhost python3.9[122972]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:13:28 localhost systemd[1]: 
Stopping Dynamic System Tuning Daemon... Nov 26 04:13:28 localhost systemd[1]: tuned.service: Deactivated successfully. Nov 26 04:13:28 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Nov 26 04:13:28 localhost systemd[1]: tuned.service: Consumed 1.747s CPU time, no IO. Nov 26 04:13:28 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Nov 26 04:13:29 localhost systemd[1]: Started Dynamic System Tuning Daemon. Nov 26 04:13:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31397 DF PROTO=TCP SPT=49570 DPT=9102 SEQ=1503954600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD92BD0000000001030307) Nov 26 04:13:31 localhost python3.9[123074]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Nov 26 04:13:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59164 DF PROTO=TCP SPT=36050 DPT=9101 SEQ=1034858938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AD9F3D0000000001030307) Nov 26 04:13:35 localhost python3.9[123166]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:13:35 localhost systemd[1]: Reloading. Nov 26 04:13:35 localhost systemd-sysv-generator[123198]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:13:35 localhost systemd-rc-local-generator[123195]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 04:13:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:13:36 localhost python3.9[123296]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:13:36 localhost systemd[1]: Reloading. Nov 26 04:13:36 localhost systemd-sysv-generator[123329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:13:36 localhost systemd-rc-local-generator[123325]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:13:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:13:38 localhost python3.9[123426]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:13:39 localhost python3.9[123519]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:13:39 localhost kernel: Adding 1048572k swap on /swap. 
Priority:-2 extents:1 across:1048572k FS Nov 26 04:13:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60121 DF PROTO=TCP SPT=34804 DPT=9100 SEQ=1700388239 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ADB3FC0000000001030307) Nov 26 04:13:40 localhost python3.9[123612]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:13:41 localhost python3.9[123711]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:13:42 localhost python3.9[123804]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:13:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59165 DF PROTO=TCP SPT=36050 DPT=9101 SEQ=1034858938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ADBFFC0000000001030307) Nov 26 04:13:42 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Nov 26 04:13:42 localhost systemd[1]: Stopped Apply Kernel Variables. Nov 26 04:13:42 localhost systemd[1]: Stopping Apply Kernel Variables... Nov 26 04:13:42 localhost systemd[1]: Starting Apply Kernel Variables... Nov 26 04:13:42 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. 
Nov 26 04:13:42 localhost systemd[1]: Finished Apply Kernel Variables. Nov 26 04:13:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57106 DF PROTO=TCP SPT=33020 DPT=9882 SEQ=942122302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ADC1FC0000000001030307) Nov 26 04:13:43 localhost systemd[1]: session-39.scope: Deactivated successfully. Nov 26 04:13:43 localhost systemd[1]: session-39.scope: Consumed 1min 58.182s CPU time. Nov 26 04:13:43 localhost systemd-logind[761]: Session 39 logged out. Waiting for processes to exit. Nov 26 04:13:43 localhost systemd-logind[761]: Removed session 39. Nov 26 04:13:45 localhost sshd[123824]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:13:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42130 DF PROTO=TCP SPT=43008 DPT=9105 SEQ=673802068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ADD0BC0000000001030307) Nov 26 04:13:48 localhost sshd[123826]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:13:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42131 DF PROTO=TCP SPT=43008 DPT=9105 SEQ=673802068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ADD8BD0000000001030307) Nov 26 04:13:48 localhost systemd-logind[761]: New session 40 of user zuul. Nov 26 04:13:48 localhost systemd[1]: Started Session 40 of User zuul. 
Nov 26 04:13:49 localhost python3.9[123919]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:13:51 localhost python3.9[124013]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:13:52 localhost python3.9[124186]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:13:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42132 DF PROTO=TCP SPT=43008 DPT=9105 SEQ=673802068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ADE87C0000000001030307) Nov 26 04:13:53 localhost python3.9[124277]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:13:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20768 DF PROTO=TCP SPT=58316 DPT=9102 SEQ=4087837562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ADEFFC0000000001030307) Nov 26 04:13:54 localhost python3.9[124373]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:13:55 localhost python3.9[124427]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False 
disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:13:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50142 DF PROTO=TCP SPT=34860 DPT=9102 SEQ=210393682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52ADFBFC0000000001030307) Nov 26 04:13:59 localhost python3.9[124521]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:14:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20770 DF PROTO=TCP SPT=58316 DPT=9102 SEQ=4087837562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE07BD0000000001030307) Nov 26 04:14:00 localhost python3.9[124676]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:14:01 localhost python3.9[124768]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None 
stdin=None Nov 26 04:14:02 localhost python3.9[124872]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:14:02 localhost python3.9[124920]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:14:03 localhost python3.9[125012]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:14:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60931 DF PROTO=TCP SPT=57588 DPT=9101 SEQ=2470538951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE147D0000000001030307) Nov 26 04:14:04 localhost python3.9[125085]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148443.0584266-324-235209985463878/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None 
Nov 26 04:14:05 localhost python3.9[125177]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 26 04:14:05 localhost systemd-journald[47778]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Nov 26 04:14:05 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 26 04:14:05 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:14:05 localhost rsyslogd[760]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:14:05 localhost python3.9[125270]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 26 04:14:06 localhost python3.9[125362]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 26 04:14:06 localhost python3.9[125454]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 26 04:14:07 localhost python3.9[125544]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:14:08 localhost python3.9[125638]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 
'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 26 04:14:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20771 DF PROTO=TCP SPT=58316 DPT=9102 SEQ=4087837562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE27FC0000000001030307) Nov 26 04:14:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60932 DF PROTO=TCP SPT=57588 DPT=9101 SEQ=2470538951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE33FC0000000001030307) Nov 26 04:14:12 localhost python3.9[125732]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 26 04:14:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51905 DF PROTO=TCP SPT=51660 DPT=9105 SEQ=3186151158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE41F00000000001030307) Nov 26 04:14:16 localhost python3.9[125826]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 26 04:14:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51906 DF PROTO=TCP SPT=51660 DPT=9105 SEQ=3186151158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE45FC0000000001030307) Nov 26 04:14:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51907 DF PROTO=TCP SPT=51660 DPT=9105 SEQ=3186151158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE4DFC0000000001030307) Nov 26 04:14:20 localhost python3.9[125926]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None 
conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 26 04:14:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51908 DF PROTO=TCP SPT=51660 DPT=9105 SEQ=3186151158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE5DBC0000000001030307) Nov 26 04:14:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9257 DF PROTO=TCP SPT=60686 DPT=9102 SEQ=4011794543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE653C0000000001030307) Nov 26 04:14:24 localhost sshd[126021]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:14:25 localhost python3.9[126020]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 26 04:14:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31400 DF PROTO=TCP SPT=49570 DPT=9102 SEQ=1503954600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE71FC0000000001030307) Nov 26 04:14:29 localhost python3.9[126116]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False 
cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 26 04:14:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9259 DF PROTO=TCP SPT=60686 DPT=9102 SEQ=4011794543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE7CFC0000000001030307) Nov 26 04:14:33 localhost python3.9[126210]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 26 04:14:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3763 DF PROTO=TCP SPT=41184 DPT=9101 SEQ=3346365841 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE89BC0000000001030307) Nov 26 04:14:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22056 DF PROTO=TCP SPT=45732 DPT=9100 SEQ=1738957877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AE9DFC0000000001030307) Nov 26 04:14:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3764 DF PROTO=TCP SPT=41184 DPT=9101 SEQ=3346365841 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AEA9FD0000000001030307) Nov 26 04:14:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27335 DF PROTO=TCP SPT=58422 DPT=9882 SEQ=2761474572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AEABFC0000000001030307) Nov 26 04:14:45 localhost python3.9[126379]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:14:46 localhost python3.9[126484]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:14:46 localhost python3.9[126557]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764148485.6098096-723-215496817259420/.source.json _original_basename=.zl5pxwi4 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:14:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13721 DF PROTO=TCP SPT=39480 DPT=9105 SEQ=119736391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AEBB3C0000000001030307) Nov 26 04:14:47 localhost python3.9[126649]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 26 04:14:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13722 DF PROTO=TCP SPT=39480 DPT=9105 SEQ=119736391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AEC33D0000000001030307) Nov 26 04:14:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13723 DF PROTO=TCP SPT=39480 DPT=9105 SEQ=119736391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AED2FC0000000001030307) Nov 26 
04:14:53 localhost podman[126663]: 2025-11-26 09:14:47.889508543 +0000 UTC m=+0.044256836 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Nov 26 04:14:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1623 DF PROTO=TCP SPT=52140 DPT=9102 SEQ=1636606834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AEDA7C0000000001030307) Nov 26 04:14:55 localhost python3.9[126926]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 26 04:14:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53809 DF PROTO=TCP SPT=44244 DPT=9101 SEQ=1330694740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AEE73C0000000001030307) Nov 26 04:15:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60934 DF PROTO=TCP SPT=57588 DPT=9101 SEQ=2470538951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52AEF1FC0000000001030307) Nov 26 04:15:02 localhost podman[126939]: 2025-11-26 09:14:55.099225139 +0000 UTC m=+0.031267494 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 26 04:15:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53811 DF PROTO=TCP SPT=44244 DPT=9101 SEQ=1330694740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AEFEFC0000000001030307) Nov 26 04:15:04 localhost python3.9[127155]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 26 04:15:05 localhost sshd[127181]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:15:06 localhost podman[127168]: 2025-11-26 09:15:04.497488748 +0000 UTC m=+0.031914624 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Nov 26 04:15:07 localhost python3.9[127333]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman 
build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 26 04:15:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1626 DF PROTO=TCP SPT=52140 DPT=9102 SEQ=1636606834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF11FC0000000001030307) Nov 26 04:15:09 localhost podman[127346]: 2025-11-26 09:15:07.932317102 +0000 UTC m=+0.042244473 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:15:10 localhost python3.9[127508]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 26 04:15:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 
MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53812 DF PROTO=TCP SPT=44244 DPT=9101 SEQ=1330694740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF1FFD0000000001030307) Nov 26 04:15:13 localhost podman[127521]: 2025-11-26 09:15:10.594169451 +0000 UTC m=+0.045437372 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Nov 26 04:15:14 localhost python3.9[127698]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 26 04:15:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21084 DF PROTO=TCP SPT=48644 DPT=9105 SEQ=1163738957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF2C500000000001030307) Nov 26 04:15:16 localhost podman[127711]: 2025-11-26 09:15:14.915743881 +0000 UTC m=+0.049039512 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Nov 26 04:15:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21085 DF PROTO=TCP SPT=48644 DPT=9105 SEQ=1163738957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF303C0000000001030307) Nov 26 04:15:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21086 DF PROTO=TCP SPT=48644 DPT=9105 SEQ=1163738957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF383C0000000001030307) Nov 26 04:15:18 localhost systemd-logind[761]: Session 40 logged out. Waiting for processes to exit. Nov 26 04:15:18 localhost systemd[1]: session-40.scope: Deactivated successfully. Nov 26 04:15:18 localhost systemd[1]: session-40.scope: Consumed 1min 27.545s CPU time. Nov 26 04:15:18 localhost systemd-logind[761]: Removed session 40. Nov 26 04:15:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21087 DF PROTO=TCP SPT=48644 DPT=9105 SEQ=1163738957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF47FC0000000001030307) Nov 26 04:15:24 localhost sshd[127872]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:15:24 localhost systemd-logind[761]: New session 41 of user zuul. Nov 26 04:15:24 localhost systemd[1]: Started Session 41 of User zuul. 
Nov 26 04:15:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25157 DF PROTO=TCP SPT=50856 DPT=9102 SEQ=1160757675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF4FBD0000000001030307) Nov 26 04:15:25 localhost python3.9[127965]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:15:27 localhost python3.9[128061]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None Nov 26 04:15:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22058 DF PROTO=TCP SPT=45732 DPT=9100 SEQ=1738957877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF5BFC0000000001030307) Nov 26 04:15:29 localhost python3.9[128191]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:15:30 localhost python3.9[128336]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 26 04:15:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=25159 DF PROTO=TCP SPT=50856 DPT=9102 SEQ=1160757675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF677C0000000001030307) Nov 26 04:15:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5543 DF PROTO=TCP SPT=60370 DPT=9101 SEQ=2616454132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF73FC0000000001030307) Nov 26 04:15:36 localhost python3.9[128559]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:15:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25160 DF PROTO=TCP SPT=50856 DPT=9102 SEQ=1160757675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF87FC0000000001030307) Nov 26 04:15:40 localhost python3.9[128653]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 26 04:15:41 localhost python3.9[128746]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:15:42 localhost python3.9[128838]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present 
target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None Nov 26 04:15:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5544 DF PROTO=TCP SPT=60370 DPT=9101 SEQ=2616454132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF93FC0000000001030307) Nov 26 04:15:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30148 DF PROTO=TCP SPT=38232 DPT=9882 SEQ=3477274860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AF95FC0000000001030307) Nov 26 04:15:43 localhost sshd[128845]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:15:43 localhost kernel: SELinux: Converting 2754 SID table entries... Nov 26 04:15:43 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 04:15:43 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 04:15:43 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 04:15:43 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 04:15:43 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 04:15:43 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 04:15:43 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 04:15:45 localhost python3.9[128937]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:15:46 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=18 res=1 Nov 26 04:15:46 localhost python3.9[129035]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 
'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:15:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62210 DF PROTO=TCP SPT=58962 DPT=9105 SEQ=1902861555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AFA57C0000000001030307) Nov 26 04:15:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62211 DF PROTO=TCP SPT=58962 DPT=9105 SEQ=1902861555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AFAD7C0000000001030307) Nov 26 04:15:50 localhost python3.9[129129]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:15:51 localhost python3.9[129374]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t 
state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 26 04:15:52 localhost python3.9[129464]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:15:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62212 DF PROTO=TCP SPT=58962 DPT=9105 SEQ=1902861555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AFBD3C0000000001030307) Nov 26 04:15:53 localhost python3.9[129558]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:15:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26451 DF PROTO=TCP SPT=35378 DPT=9102 SEQ=3084962078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AFC4BC0000000001030307) Nov 26 04:15:57 localhost python3.9[129652]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False 
bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:15:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14491 DF PROTO=TCP SPT=58434 DPT=9101 SEQ=1659336366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AFD17C0000000001030307) Nov 26 04:16:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26453 DF PROTO=TCP SPT=35378 DPT=9102 SEQ=3084962078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AFDC7D0000000001030307) Nov 26 04:16:01 localhost python3.9[129776]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 26 04:16:01 localhost systemd[1]: Reloading. Nov 26 04:16:01 localhost systemd-rc-local-generator[129815]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:16:01 localhost systemd-sysv-generator[129819]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:16:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:16:02 localhost python3.9[129969]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:16:03 localhost python3.9[130080]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:16:03 localhost python3.9[130174]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:16:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14493 DF PROTO=TCP SPT=58434 DPT=9101 SEQ=1659336366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AFE93C0000000001030307) Nov 26 04:16:04 localhost python3.9[130266]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:16:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:16:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 5692 writes, 25K keys, 5692 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5692 writes, 763 syncs, 7.46 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 04:16:05 localhost python3.9[130358]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:16:06 localhost python3.9[130446]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148564.9855561-564-232834978793020/.source _original_basename=.wx2a1a22 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:16:06 localhost python3.9[130538]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None 
group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:16:07 localhost python3.9[130630]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={} Nov 26 04:16:08 localhost python3.9[130722]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:16:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26454 DF PROTO=TCP SPT=35378 DPT=9102 SEQ=3084962078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52AFFBFD0000000001030307) Nov 26 04:16:09 localhost python3.9[130814]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:16:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:16:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 4860 writes, 21K keys, 4860 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4860 writes, 621 syncs, 7.83 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 
percent Nov 26 04:16:09 localhost python3.9[130887]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148568.728752-690-137944747482327/.source.yaml _original_basename=.6wv09ff1 follow=False checksum=06d744ebe702728c19f6d1a8f97158d086012058 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:16:10 localhost python3.9[130979]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml Nov 26 04:16:11 localhost ansible-async_wrapper.py[131084]: Invoked with j572361731685 300 /home/zuul/.ansible/tmp/ansible-tmp-1764148570.977421-762-251543587848217/AnsiballZ_edpm_os_net_config.py _ Nov 26 04:16:11 localhost ansible-async_wrapper.py[131087]: Starting module and watcher Nov 26 04:16:11 localhost ansible-async_wrapper.py[131087]: Start watching 131088 (300) Nov 26 04:16:11 localhost ansible-async_wrapper.py[131088]: Start module (131088) Nov 26 04:16:11 localhost ansible-async_wrapper.py[131084]: Return async_wrapper task started. 
Nov 26 04:16:12 localhost python3.9[131089]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False Nov 26 04:16:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14494 DF PROTO=TCP SPT=58434 DPT=9101 SEQ=1659336366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B009FC0000000001030307) Nov 26 04:16:12 localhost ansible-async_wrapper.py[131088]: Module complete (131088) Nov 26 04:16:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10114 DF PROTO=TCP SPT=59356 DPT=9882 SEQ=2409567819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B00BFD0000000001030307) Nov 26 04:16:15 localhost python3.9[131181]: ansible-ansible.legacy.async_status Invoked with jid=j572361731685.131084 mode=status _async_dir=/root/.ansible_async Nov 26 04:16:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38628 DF PROTO=TCP SPT=57972 DPT=9105 SEQ=3706600193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B016B00000000001030307) Nov 26 04:16:15 localhost python3.9[131240]: ansible-ansible.legacy.async_status Invoked with jid=j572361731685.131084 mode=cleanup _async_dir=/root/.ansible_async Nov 26 04:16:16 localhost ansible-async_wrapper.py[131087]: Done in kid B. 
Nov 26 04:16:17 localhost python3.9[131332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:16:17 localhost python3.9[131405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148576.6674416-828-100355238433605/.source.returncode _original_basename=.3xsms8zy follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:16:18 localhost python3.9[131497]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:16:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38630 DF PROTO=TCP SPT=57972 DPT=9105 SEQ=3706600193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B022BD0000000001030307) Nov 26 04:16:19 localhost python3.9[131570]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148578.0073159-876-109539910740974/.source.cfg _original_basename=.8epcqp3w follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Nov 26 04:16:19 localhost python3.9[131662]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:16:19 localhost systemd[1]: Reloading Network Manager... Nov 26 04:16:19 localhost NetworkManager[5970]: [1764148579.8401] audit: op="reload" arg="0" pid=131666 uid=0 result="success" Nov 26 04:16:19 localhost NetworkManager[5970]: [1764148579.8411] config: signal: SIGHUP (no changes from disk) Nov 26 04:16:19 localhost systemd[1]: Reloaded Network Manager. Nov 26 04:16:20 localhost systemd[1]: session-41.scope: Deactivated successfully. Nov 26 04:16:20 localhost systemd[1]: session-41.scope: Consumed 35.300s CPU time. Nov 26 04:16:20 localhost systemd-logind[761]: Session 41 logged out. Waiting for processes to exit. Nov 26 04:16:20 localhost systemd-logind[761]: Removed session 41. Nov 26 04:16:22 localhost sshd[131681]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:16:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38631 DF PROTO=TCP SPT=57972 DPT=9105 SEQ=3706600193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0327C0000000001030307) Nov 26 04:16:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3162 DF PROTO=TCP SPT=53222 DPT=9102 SEQ=1285234960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B039FC0000000001030307) Nov 26 04:16:25 localhost sshd[131683]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:16:25 localhost systemd-logind[761]: New session 42 of user zuul. Nov 26 04:16:25 localhost systemd[1]: Started Session 42 of User zuul. 
Nov 26 04:16:26 localhost python3.9[131776]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:16:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57024 DF PROTO=TCP SPT=60042 DPT=9100 SEQ=825567329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B045FC0000000001030307) Nov 26 04:16:27 localhost python3.9[131870]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:16:29 localhost python3.9[132023]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:16:29 localhost systemd[1]: session-42.scope: Deactivated successfully. Nov 26 04:16:29 localhost systemd[1]: session-42.scope: Consumed 1.938s CPU time. Nov 26 04:16:29 localhost systemd-logind[761]: Session 42 logged out. Waiting for processes to exit. Nov 26 04:16:29 localhost systemd-logind[761]: Removed session 42. 
Nov 26 04:16:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3164 DF PROTO=TCP SPT=53222 DPT=9102 SEQ=1285234960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B051BD0000000001030307)
Nov 26 04:16:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20112 DF PROTO=TCP SPT=56028 DPT=9101 SEQ=1447830488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B05E7C0000000001030307)
Nov 26 04:16:34 localhost sshd[132039]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:16:35 localhost systemd-logind[761]: New session 43 of user zuul.
Nov 26 04:16:35 localhost systemd[1]: Started Session 43 of User zuul.
Nov 26 04:16:36 localhost python3.9[132132]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:16:37 localhost python3.9[132226]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:16:38 localhost python3.9[132322]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 04:16:39 localhost python3.9[132376]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 26 04:16:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3165 DF PROTO=TCP SPT=53222 DPT=9102 SEQ=1285234960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B071FD0000000001030307)
Nov 26 04:16:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20113 DF PROTO=TCP SPT=56028 DPT=9101 SEQ=1447830488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B07DFD0000000001030307)
Nov 26 04:16:43 localhost python3.9[132470]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 04:16:44 localhost python3.9[132625]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:16:45 localhost python3.9[132717]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:16:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36670 DF PROTO=TCP SPT=50250 DPT=9105 SEQ=2299978362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B08BE00000000001030307)
Nov 26 04:16:46 localhost python3.9[132819]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:16:46 localhost python3.9[132867]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:16:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36671 DF PROTO=TCP SPT=50250 DPT=9105 SEQ=2299978362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B08FFD0000000001030307)
Nov 26 04:16:47 localhost python3.9[132959]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:16:47 localhost python3.9[133007]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:16:48 localhost python3.9[133099]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:16:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36672 DF PROTO=TCP SPT=50250 DPT=9105 SEQ=2299978362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B097FC0000000001030307)
Nov 26 04:16:49 localhost python3.9[133191]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:16:49 localhost python3.9[133283]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:16:50 localhost python3.9[133375]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:16:51 localhost python3.9[133467]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 26 04:16:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36673 DF PROTO=TCP SPT=50250 DPT=9105 SEQ=2299978362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0A7BC0000000001030307)
Nov 26 04:16:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52168 DF PROTO=TCP SPT=60990 DPT=9102 SEQ=1586718323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0AF3C0000000001030307)
Nov 26 04:16:55 localhost python3.9[133561]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:16:55 localhost python3.9[133655]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:16:56 localhost python3.9[133747]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:16:57 localhost python3.9[133839]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:16:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10204 DF PROTO=TCP SPT=35384 DPT=9101 SEQ=3078839376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0BBFC0000000001030307)
Nov 26 04:16:59 localhost python3.9[133932]: ansible-service_facts Invoked
Nov 26 04:16:59 localhost network[133949]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 26 04:16:59 localhost network[133950]: 'network-scripts' will be removed from distribution in near future.
Nov 26 04:16:59 localhost network[133951]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 26 04:17:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:17:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52170 DF PROTO=TCP SPT=60990 DPT=9102 SEQ=1586718323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0C6FC0000000001030307)
Nov 26 04:17:01 localhost sshd[134014]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:17:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10206 DF PROTO=TCP SPT=35384 DPT=9101 SEQ=3078839376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0D3BC0000000001030307)
Nov 26 04:17:05 localhost auditd[727]: Audit daemon rotating log files
Nov 26 04:17:06 localhost python3.9[134304]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 26 04:17:07 localhost podman[134408]:
Nov 26 04:17:07 localhost podman[134408]: 2025-11-26 09:17:07.46274783 +0000 UTC m=+0.101513110 container create a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_colden, release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 04:17:07 localhost podman[134408]: 2025-11-26 09:17:07.408760846 +0000 UTC m=+0.047526146 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:17:07 localhost systemd[1]: Started libpod-conmon-a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74.scope.
Nov 26 04:17:07 localhost systemd[1]: Started libcrun container.
Nov 26 04:17:07 localhost podman[134408]: 2025-11-26 09:17:07.576431215 +0000 UTC m=+0.215196515 container init a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_colden, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553)
Nov 26 04:17:07 localhost podman[134408]: 2025-11-26 09:17:07.590132598 +0000 UTC m=+0.228897878 container start a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_colden, build-date=2025-09-24T08:57:55, version=7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 04:17:07 localhost podman[134408]: 2025-11-26 09:17:07.590343314 +0000 UTC m=+0.229108604 container attach a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_colden, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container)
Nov 26 04:17:07 localhost interesting_colden[134422]: 167 167
Nov 26 04:17:07 localhost systemd[1]: libpod-a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74.scope: Deactivated successfully.
Nov 26 04:17:07 localhost podman[134408]: 2025-11-26 09:17:07.594660118 +0000 UTC m=+0.233425428 container died a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_colden, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, distribution-scope=public, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, io.buildah.version=1.33.12, RELEASE=main, ceph=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Nov 26 04:17:07 localhost podman[134427]: 2025-11-26 09:17:07.679810942 +0000 UTC m=+0.076023064 container remove a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_colden, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, release=553, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 26 04:17:07 localhost systemd[1]: libpod-conmon-a7d874d07fdb8fec598bc2318bca14f7979e18ef4e5fa4b32ae3d9a0a4d23b74.scope: Deactivated successfully.
Nov 26 04:17:07 localhost podman[134448]:
Nov 26 04:17:07 localhost podman[134448]: 2025-11-26 09:17:07.889381743 +0000 UTC m=+0.073209538 container create bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_wing, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 26 04:17:07 localhost systemd[1]: Started libpod-conmon-bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246.scope.
Nov 26 04:17:07 localhost systemd[1]: Started libcrun container.
Nov 26 04:17:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12ab9a4ff24c08e96a7018c3df5460bfa482812a9d91e5145451fdff9e30c7fa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 26 04:17:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12ab9a4ff24c08e96a7018c3df5460bfa482812a9d91e5145451fdff9e30c7fa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 26 04:17:07 localhost podman[134448]: 2025-11-26 09:17:07.863846576 +0000 UTC m=+0.047674351 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:17:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12ab9a4ff24c08e96a7018c3df5460bfa482812a9d91e5145451fdff9e30c7fa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 26 04:17:07 localhost podman[134448]: 2025-11-26 09:17:07.972850427 +0000 UTC m=+0.156678192 container init bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_wing, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=553, io.buildah.version=1.33.12)
Nov 26 04:17:07 localhost podman[134448]: 2025-11-26 09:17:07.984406243 +0000 UTC m=+0.168234038 container start bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_wing, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 26 04:17:07 localhost podman[134448]: 2025-11-26 09:17:07.984724923 +0000 UTC m=+0.168552748 container attach bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_wing, release=553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7)
Nov 26 04:17:08 localhost systemd[1]: var-lib-containers-storage-overlay-4df0904317610a4c4d0de6842d1bd28bc8a395bdda90c8360bd5ca56a5fc121a-merged.mount: Deactivated successfully.
Nov 26 04:17:08 localhost laughing_wing[134464]: [
Nov 26 04:17:08 localhost laughing_wing[134464]: {
Nov 26 04:17:08 localhost laughing_wing[134464]: "available": false,
Nov 26 04:17:08 localhost laughing_wing[134464]: "ceph_device": false,
Nov 26 04:17:08 localhost laughing_wing[134464]: "device_id": "QEMU_DVD-ROM_QM00001",
Nov 26 04:17:08 localhost laughing_wing[134464]: "lsm_data": {},
Nov 26 04:17:08 localhost laughing_wing[134464]: "lvs": [],
Nov 26 04:17:08 localhost laughing_wing[134464]: "path": "/dev/sr0",
Nov 26 04:17:08 localhost laughing_wing[134464]: "rejected_reasons": [
Nov 26 04:17:08 localhost laughing_wing[134464]: "Has a FileSystem",
Nov 26 04:17:08 localhost laughing_wing[134464]: "Insufficient space (<5GB)"
Nov 26 04:17:08 localhost laughing_wing[134464]: ],
Nov 26 04:17:08 localhost laughing_wing[134464]: "sys_api": {
Nov 26 04:17:08 localhost laughing_wing[134464]: "actuators": null,
Nov 26 04:17:08 localhost laughing_wing[134464]: "device_nodes": "sr0",
Nov 26 04:17:08 localhost laughing_wing[134464]: "human_readable_size": "482.00 KB",
Nov 26 04:17:08 localhost laughing_wing[134464]: "id_bus": "ata",
Nov 26 04:17:08 localhost laughing_wing[134464]: "model": "QEMU DVD-ROM",
Nov 26 04:17:08 localhost laughing_wing[134464]: "nr_requests": "2",
Nov 26 04:17:08 localhost laughing_wing[134464]: "partitions": {},
Nov 26 04:17:08 localhost laughing_wing[134464]: "path": "/dev/sr0",
Nov 26 04:17:08 localhost laughing_wing[134464]: "removable": "1",
Nov 26 04:17:08 localhost laughing_wing[134464]: "rev": "2.5+",
Nov 26 04:17:08 localhost laughing_wing[134464]: "ro": "0",
Nov 26 04:17:08 localhost laughing_wing[134464]: "rotational": "1",
Nov 26 04:17:08 localhost laughing_wing[134464]: "sas_address": "",
Nov 26 04:17:08 localhost laughing_wing[134464]: "sas_device_handle": "",
Nov 26 04:17:08 localhost laughing_wing[134464]: "scheduler_mode": "mq-deadline",
Nov 26 04:17:08 localhost laughing_wing[134464]: "sectors": 0,
Nov 26 04:17:08 localhost laughing_wing[134464]: "sectorsize": "2048",
Nov 26 04:17:08 localhost laughing_wing[134464]: "size": 493568.0,
Nov 26 04:17:08 localhost laughing_wing[134464]: "support_discard": "0",
Nov 26 04:17:08 localhost laughing_wing[134464]: "type": "disk",
Nov 26 04:17:08 localhost laughing_wing[134464]: "vendor": "QEMU"
Nov 26 04:17:08 localhost laughing_wing[134464]: }
Nov 26 04:17:08 localhost laughing_wing[134464]: }
Nov 26 04:17:08 localhost laughing_wing[134464]: ]
Nov 26 04:17:08 localhost systemd[1]: libpod-bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246.scope: Deactivated successfully.
Nov 26 04:17:08 localhost podman[134448]: 2025-11-26 09:17:08.753941526 +0000 UTC m=+0.937769281 container died bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_wing, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vcs-type=git, release=553, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=)
Nov 26 04:17:08 localhost systemd[1]: var-lib-containers-storage-overlay-12ab9a4ff24c08e96a7018c3df5460bfa482812a9d91e5145451fdff9e30c7fa-merged.mount: Deactivated successfully.
Nov 26 04:17:08 localhost podman[135890]: 2025-11-26 09:17:08.842732934 +0000 UTC m=+0.074452837 container remove bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_wing, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git)
Nov 26 04:17:08 localhost systemd[1]: libpod-conmon-bc7cd52c788866a65b9a9d2501112843267d8e7fe172784f483f08874eab9246.scope: Deactivated successfully.
Nov 26 04:17:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53382 DF PROTO=TCP SPT=37476 DPT=9100 SEQ=2202499667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0E7FC0000000001030307)
Nov 26 04:17:11 localhost python3.9[136011]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 26 04:17:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10207 DF PROTO=TCP SPT=35384 DPT=9101 SEQ=3078839376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0F3FD0000000001030307)
Nov 26 04:17:12 localhost python3.9[136103]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23066 DF PROTO=TCP SPT=33986 DPT=9882 SEQ=1112993879 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B0F5FC0000000001030307)
Nov 26 04:17:13 localhost python3.9[136178]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148632.19173-658-236507803085952/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:14 localhost python3.9[136272]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:14 localhost python3.9[136347]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148633.7865753-703-199662528666300/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:16 localhost python3.9[136441]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8562 DF PROTO=TCP SPT=48772 DPT=9105 SEQ=2630262729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B104FC0000000001030307)
Nov 26 04:17:18 localhost python3.9[136535]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 04:17:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8563 DF PROTO=TCP SPT=48772 DPT=9105 SEQ=2630262729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B10CFC0000000001030307)
Nov 26 04:17:19 localhost python3.9[136589]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:17:21 localhost python3.9[136683]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 04:17:21 localhost python3.9[136737]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 04:17:21 localhost chronyd[25901]: chronyd exiting
Nov 26 04:17:21 localhost systemd[1]: Stopping NTP client/server...
Nov 26 04:17:21 localhost systemd[1]: chronyd.service: Deactivated successfully.
Nov 26 04:17:21 localhost systemd[1]: Stopped NTP client/server.
Nov 26 04:17:22 localhost systemd[1]: Starting NTP client/server...
Nov 26 04:17:22 localhost chronyd[136746]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 26 04:17:22 localhost chronyd[136746]: Frequency -30.473 +/- 0.364 ppm read from /var/lib/chrony/drift
Nov 26 04:17:22 localhost chronyd[136746]: Loaded seccomp filter (level 2)
Nov 26 04:17:22 localhost systemd[1]: Started NTP client/server.
Nov 26 04:17:22 localhost systemd[1]: session-43.scope: Deactivated successfully.
Nov 26 04:17:22 localhost systemd[1]: session-43.scope: Consumed 28.012s CPU time.
Nov 26 04:17:22 localhost systemd-logind[761]: Session 43 logged out. Waiting for processes to exit.
Nov 26 04:17:22 localhost systemd-logind[761]: Removed session 43.
Nov 26 04:17:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8564 DF PROTO=TCP SPT=48772 DPT=9105 SEQ=2630262729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B11CBC0000000001030307)
Nov 26 04:17:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35648 DF PROTO=TCP SPT=60638 DPT=9102 SEQ=1449144443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1247C0000000001030307)
Nov 26 04:17:27 localhost sshd[136762]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:17:27 localhost systemd-logind[761]: New session 44 of user zuul.
Nov 26 04:17:27 localhost systemd[1]: Started Session 44 of User zuul.
Nov 26 04:17:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3167 DF PROTO=TCP SPT=53222 DPT=9102 SEQ=1285234960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B12FFD0000000001030307)
Nov 26 04:17:28 localhost python3.9[136855]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:17:29 localhost python3.9[136951]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:30 localhost python3.9[137056]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20115 DF PROTO=TCP SPT=56028 DPT=9101 SEQ=1447830488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B13BFC0000000001030307)
Nov 26 04:17:31 localhost python3.9[137104]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.gkgjj7u3 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:32 localhost python3.9[137196]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:32 localhost python3.9[137271]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148651.8007298-144-212927868767324/.source _original_basename=.sm1sa9os follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:33 localhost python3.9[137363]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:17:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31116 DF PROTO=TCP SPT=36216 DPT=9101 SEQ=2701334916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B148BD0000000001030307)
Nov 26 04:17:34 localhost python3.9[137455]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:35 localhost python3.9[137528]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148654.0611742-216-142196283009934/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:17:36 localhost python3.9[137620]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:36 localhost python3.9[137693]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148655.6816406-216-175004338817894/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:17:38 localhost python3.9[137785]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:38 localhost python3.9[137877]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35651 DF PROTO=TCP SPT=60638 DPT=9102 SEQ=1449144443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B15BFC0000000001030307)
Nov 26 04:17:39 localhost python3.9[137950]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148658.4082305-327-194211099972566/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:40 localhost python3.9[138042]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:40 localhost python3.9[138115]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148659.5666993-372-99443994095735/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:40 localhost sshd[138162]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:17:41 localhost python3.9[138209]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:17:41 localhost systemd[1]: Reloading.
Nov 26 04:17:41 localhost systemd-rc-local-generator[138227]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:17:41 localhost systemd-sysv-generator[138233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:17:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:17:41 localhost systemd[1]: Reloading.
Nov 26 04:17:41 localhost systemd-sysv-generator[138278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:17:41 localhost systemd-rc-local-generator[138275]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:17:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:17:42 localhost systemd[1]: Starting EDPM Container Shutdown...
Nov 26 04:17:42 localhost systemd[1]: Finished EDPM Container Shutdown.
Nov 26 04:17:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31117 DF PROTO=TCP SPT=36216 DPT=9101 SEQ=2701334916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B169FC0000000001030307)
Nov 26 04:17:42 localhost python3.9[138377]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:43 localhost python3.9[138450]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148662.3405962-441-128325716034654/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:44 localhost python3.9[138542]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:44 localhost python3.9[138615]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148663.600802-486-153362498152783/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:45 localhost python3.9[138707]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:17:45 localhost systemd[1]: Reloading.
Nov 26 04:17:45 localhost systemd-rc-local-generator[138730]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:17:45 localhost systemd-sysv-generator[138735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:17:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:17:45 localhost systemd[1]: Starting Create netns directory...
Nov 26 04:17:45 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 26 04:17:45 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 26 04:17:45 localhost systemd[1]: Finished Create netns directory.
Nov 26 04:17:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20547 DF PROTO=TCP SPT=35600 DPT=9105 SEQ=686169749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B176410000000001030307)
Nov 26 04:17:46 localhost python3.9[138839]: ansible-ansible.builtin.service_facts Invoked
Nov 26 04:17:46 localhost network[138856]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 26 04:17:46 localhost network[138857]: 'network-scripts' will be removed from distribution in near future.
Nov 26 04:17:46 localhost network[138858]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 26 04:17:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20548 DF PROTO=TCP SPT=35600 DPT=9105 SEQ=686169749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B17A3C0000000001030307)
Nov 26 04:17:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:17:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20549 DF PROTO=TCP SPT=35600 DPT=9105 SEQ=686169749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1823C0000000001030307)
Nov 26 04:17:51 localhost python3.9[139059]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:17:51 localhost python3.9[139134]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148670.8754454-609-221022084188940/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:17:52 localhost python3.9[139227]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 04:17:52 localhost systemd[1]: Reloading OpenSSH server daemon...
Nov 26 04:17:52 localhost systemd[1]: Reloaded OpenSSH server daemon.
Nov 26 04:17:52 localhost sshd[119386]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:17:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20550 DF PROTO=TCP SPT=35600 DPT=9105 SEQ=686169749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B191FC0000000001030307) Nov 26 04:17:53 localhost python3.9[139323]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:17:54 localhost python3.9[139415]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:17:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2270 DF PROTO=TCP SPT=33126 DPT=9102 SEQ=1236775802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1997C0000000001030307) Nov 26 04:17:54 localhost python3.9[139488]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148674.000211-704-178425856856702/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None 
serole=None selevel=None setype=None attributes=None Nov 26 04:17:55 localhost python3.9[139580]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Nov 26 04:17:56 localhost systemd[1]: Starting Time & Date Service... Nov 26 04:17:56 localhost systemd[1]: Started Time & Date Service. Nov 26 04:17:57 localhost python3.9[139676]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:17:57 localhost python3.9[139768]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:17:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52173 DF PROTO=TCP SPT=60990 DPT=9102 SEQ=1586718323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1A5FC0000000001030307) Nov 26 04:17:58 localhost python3.9[139841]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148677.2684703-808-245312138105710/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:17:58 localhost python3.9[139933]: 
ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:17:59 localhost python3.9[140006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148678.5246143-852-128777221787075/.source.yaml _original_basename=.lxf54yvm follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:18:00 localhost python3.9[140098]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:18:00 localhost python3.9[140173]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148679.7324486-898-150465922553864/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:18:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2272 DF PROTO=TCP SPT=33126 DPT=9102 SEQ=1236775802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1B13D0000000001030307) Nov 26 04:18:01 localhost python3.9[140265]: 
ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:18:02 localhost python3.9[140358]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:18:03 localhost python3[140451]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Nov 26 04:18:03 localhost python3.9[140543]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:18:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55945 DF PROTO=TCP SPT=46368 DPT=9101 SEQ=1441433020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1BDFC0000000001030307) Nov 26 04:18:04 localhost python3.9[140616]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148683.4669993-1014-68773304977447/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:18:05 localhost python3.9[140708]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:18:05 localhost python3.9[140781]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148684.8226097-1059-70421733882167/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:18:06 localhost python3.9[140873]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:18:07 localhost python3.9[140946]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148686.1126175-1105-41995552259422/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:18:07 localhost python3.9[141038]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:18:08 localhost python3.9[141111]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148687.340889-1150-222041303750122/.source.nft follow=False _original_basename=chains.j2 
checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:18:09 localhost python3.9[141203]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:18:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29686 DF PROTO=TCP SPT=34866 DPT=9100 SEQ=2042509323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1D1FD0000000001030307) Nov 26 04:18:09 localhost python3.9[141277]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148688.6034145-1194-26931943242809/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:18:10 localhost python3.9[141428]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:18:11 localhost python3.9[141520]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft 
/etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:18:11 localhost python3.9[141630]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:18:12 localhost python3.9[141723]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:18:13 localhost python3.9[141815]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:18:14 localhost python3.9[141907]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none
state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 26 04:18:14 localhost python3.9[142000]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 26 04:18:15 localhost systemd[1]: session-44.scope: Deactivated successfully.
Nov 26 04:18:15 localhost systemd[1]: session-44.scope: Consumed 26.960s CPU time.
Nov 26 04:18:15 localhost systemd-logind[761]: Session 44 logged out. Waiting for processes to exit.
Nov 26 04:18:15 localhost systemd-logind[761]: Removed session 44.
Nov 26 04:18:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11547 DF PROTO=TCP SPT=52028 DPT=9105 SEQ=4108040695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1EB700000000001030307)
Nov 26 04:18:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20552 DF PROTO=TCP SPT=35600 DPT=9105 SEQ=686169749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1F1FC0000000001030307)
Nov 26 04:18:19 localhost sshd[142016]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:18:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8567 DF PROTO=TCP SPT=48772 DPT=9105 SEQ=2630262729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B1FBFC0000000001030307)
Nov 26 04:18:21 localhost sshd[142018]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:18:21 localhost systemd-logind[761]: New session 45 of user zuul.
Nov 26 04:18:21 localhost systemd[1]: Started Session 45 of User zuul.
Nov 26 04:18:22 localhost python3.9[142113]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 26 04:18:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26751 DF PROTO=TCP SPT=57954 DPT=9102 SEQ=2417312452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B20AC50000000001030307)
Nov 26 04:18:23 localhost python3.9[142205]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:18:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15118 DF PROTO=TCP SPT=54532 DPT=9100 SEQ=508726150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B20BED0000000001030307)
Nov 26 04:18:25 localhost python3.9[142299]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 26 04:18:26 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 26 04:18:26 localhost python3.9[142394]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.w1whuj5c follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:18:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14567 DF PROTO=TCP SPT=46934 DPT=9101 SEQ=4145575286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2176B0000000001030307)
Nov 26 04:18:27 localhost python3.9[142469]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.w1whuj5c mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148706.142977-190-242704776309217/.source.w1whuj5c _original_basename=.zclj_6ev follow=False checksum=62000e966070c93c0ab4a39aaf05545c71aa7283 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:18:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22751 DF PROTO=TCP SPT=50486 DPT=9100 SEQ=3892229714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B21BFC0000000001030307)
Nov 26 04:18:29 localhost python3.9[142561]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:18:30 localhost python3.9[142653]: ansible-ansible.builtin.blockinfile Invoked with block=np0005536118.localdomain,192.168.122.107,np0005536118* ssh-rsa
AAAAB3NzaC1yc2EAAAADAQABAAABgQCfnfafkBxgNm6Kh5k6DvljneX0c5GykKaP2XXhwemEW1/qBm+4yfqmo2y4C08a3GNzTwfPv6B8iwTNKUs6SWEt24JWI/lAVh/ocwBt6VE7KXoscG0Ha7PCZsrtdondI3wHXdYnelEySlDWfgVDx7J/STk7rxKcUS+V5otrTN85yB2OHKKmJ492Y00oCiaudBb9eef6hCAL/JcF2/VXXOpMs1zH/15Mb8bYruhzP/5xB7CCeKLXivfRH7Wn37Ds8UzxSdZUUK5y7TD0QGXGoTnLf3XGG5pBLXvHjG5Mh+Owvt+B5RLXIbXX03+hVoOi8ZMOZNkSUJl/z82BYCARUxbkbrANQqxf9138BGvkERGRfDEWqQUW1dWYEWc8PGLx6fIdrWDBHglnS2RYtXjK/rQktahcaZD7FXQOnaXQv8mfQ0q8kAqmrFA+gV93Ss3keS7YBd8ZASQpIHPZQFRxkztDV638fMq/eiuc59AZGmrzDe3PFtDXroz1uJKJAJdLNhc=#012np0005536118.localdomain,192.168.122.107,np0005536118* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIA91mIpEpZ/+BnZLgJ7EcSJqXjm9KE7iJyil0HOYyh1n#012np0005536118.localdomain,192.168.122.107,np0005536118* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBC2OKhioNsHm0UhKef5IguteC1CzFuf/qzNAJl6YZetwaM/GBaPTHk3boeHHiqwxEjjgemEh8fJUU2qEg0veIQ=#012np0005536117.localdomain,192.168.122.106,np0005536117* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvuiblKTsbmL2sEkAoI8N4XKL02+ZQhAWAKn7UrsKCBB+OGZN1PDIVxXaBQLqG39LX15HM+BCqqS/mo8n1AnvNg89PsWVtq1GxHkisfP/wlNkstEbw85Ezi/3gIjGsbvZ0Bnyh1Zi9GQny2ekd5fOPUP7VVcn5pOrgWNAsC4JWU2vNSsGyg9B+aqV/qfQxTX1REK7lVvKdzqw4RCTHje8SJXUTTSDnD81wreB6Vl3QWWAdVVHsW2UUA74nGGY3XyNUZZuGuCHYKAUElXQdguhdjsq986AS/I81Km6Ak6I9FajfVDmk/iJl/G/Kg7bcas4rbNMBNmR22pcxHOrRUR6RMBYnYdfFTmatTMJKzZr9SFHbYT8gIkC9S4Xi5esJmySHtFBlK0u2MMmOaQAyqGL7xZEOxPrYvaTwrn4QPJYOlGDJKH8HuruPKL6h1HrLKK5no0WhlfnufFo2rdMbbvNbGy+5PgwqJCpqX8DltF/Og/pa+33AXc1JlJZ8zPx1LYM=#012np0005536117.localdomain,192.168.122.106,np0005536117* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICqlTH0QQqnRVSXOj2NpG79oVXs1lgaK5gwwjl9aCWqk#012np0005536117.localdomain,192.168.122.106,np0005536117* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ9QTjPvNhKkNLnpnhnOUtXpatbxHD+gT+Jw1dmbC0ETHYftvw/hytJjU2Ax4VdyjNk68MSdLsizzaP+HYhHolo=#012np0005536112.localdomain,192.168.122.103,np0005536112* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCnw/aSCc3H/eikMKIv8MupYWNOR8z7XR+3smBj2uRUTCxpYQMrYPQJS0ym3zpBW0k4x6uKgLc5QfAOIUixQFqmWslELY62n9dp/YdIcnizhKWLQDam94X1ghhDgUbGQ+fdkkvUXWdf7fjjf1xQpHhiL9GDojGAucM1miq+IMHyr2MOJUch+9AXTKwY6Uj8bQi8zuipqxiZHqJhJTAqihg84NSz+4j2x2Ne5dj4Q51PgO2g4TbRhlGB6fKfAB4bRJoPCJ5B9CVBMQaMoWOjwTQ2IYOyF1S2NmYp1Q5+48gmnmW+/Q2RVpvV3nO+JamlCt62HbfuI7eVY6iA3yGJFdUMtlvEZnWj33b1ZflItoDXioyNIfjDo5apKjs3c2W7bnYUpj4Ibdm2IG6nnZJUwRmiAK+UJyvppntz7sD/sj9mlcMJ2Is+lKZKk6x+xMap1clUet53JUhbYz48+AlIKsLq42H0Q1bNkwHVjHe9G8J0Oyey6yoGZ/3Ct7WfChtICek=#012np0005536112.localdomain,192.168.122.103,np0005536112* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEhTNMHoEUKeo+SbFEjVq0noQjdh5ueeCaAMXSXq+cUS#012np0005536112.localdomain,192.168.122.103,np0005536112* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAS5iO/sH1uWBLyJGCeN1lcb3XfVYnTcZfJCpNP9CuyoZ07rz87H7oQZHVDzQPk6h/i8gNKlBcRQ/+RyUfW57sQ=#012np0005536114.localdomain,192.168.122.105,np0005536114* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsepcg+BueYLiPgRHnP9Izs6ROoJIH+OgayDdq1vuZHUwaHTqGCuLqGGJHUB7pN6LVaaMeaxMqz615UuHzL1S8q0VpdrxlRYDvwaY/OI3okxeGCpmkUWORcZlxfhYklmXCUTnEfVisKc379eTDBcFWqgA/GKCRJ+KzzNuunc4S5HjuSGdXSMFSlNOhdX0yW1dGsGVIG7Yihr76o1WhifRGz4KEAQ9F3Kq3YTcbLLcsqlU9r6qHaAj3M19ulSoUaH8GvfUnQa9FX24xH5pbSFBL9P5onW2xZZf4Dl/K5sE4PETonYgeqPONH09NPE8qBduLsJKVGl3wXMkMNbOcxOM1TEuOld3F4kkmFj5txTfV+vftRpWL5fcP83bMRw4r1lm6XmiQi+5KAKizplNvKE74oOKiash0ylGy8qK4yFMkTVu5F8ulzb9EDeKyzLTBz6HFeosGXyEMG15QXT68QHoYUEPVl/WiDxcqy2qed8dCCZd0b7xV4tb3jJ45MUVI/M=#012np0005536114.localdomain,192.168.122.105,np0005536114* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC8g3+5saep0KUt+B+xBvdPd+g8d1cqU/g3MxUV7N1u+#012np0005536114.localdomain,192.168.122.105,np0005536114* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHT8rjPGIJNO77K+UcW9V6JAD+bMHLaS11y3riZBA/kj4EWt9qpD9zvxzByAWLKFL4Czse4+1gZJH9aEA42byFU=#012np0005536113.localdomain,192.168.122.104,np0005536113* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCcNCOFbU4kXLoESv/2g1Ngr/xjK4i+uRjHpmzX1pOkz0pFxT3jscjN9VufPlOVwhkzCZZmudRNYn+Yv6BKrU+arWLx5NIIkErYWE6+lYRTKPt81XDZ9pZwtdR59NiimZURgJntJ/Ru0lPTJGpMJ3x65MyMyQV4kOEjdCiwhnLSp4XlQMBsmOB7tpglxHiPSkFMvwWTg8pbWwMA+td249DJF7U5eM+rvSCXS4maLqVmbYXuy6O+rNjaPgpAiLoJUn4HclHA3QPPT0KFE4aamiRb6ge0mG/XEMj3yM402Amdu7Rf4uR5Y+25j0VGSlmEOLKsRLOpV2xNfgJthx7xnfOoPlzRsl5sp0VxC9k9FeKtdJH8vtnrRkBeCZpME3/DWwH7ZylwHFC2Ew4ws53R/R+hp27zxNJ0isqnkvAViw9HjfC+ChQv9H0Z52p8plqo14Nc0JCESOolX2/apTOTKbd6+Cfxv5QxsZWdNeteKQfFK4mUcGrhujE4vqv/U11ThQE=#012np0005536113.localdomain,192.168.122.104,np0005536113* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOzkfyvzlV9FsbZQSAPfK4jG5AuvWk8fegG/IHJm5rej#012np0005536113.localdomain,192.168.122.104,np0005536113* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLyPJlf94gNjz7jeDrIzlyx4RjLJguu39FlSUBLZ1n9KwTPy93Bbte44nltO23J8X0JxXqGXAaPpw2jIndnJRWw=#012np0005536119.localdomain,192.168.122.108,np0005536119* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyUtJmwR3ObwtG76w1yXvomHeazZ9D9Kciy4thYCqfEBxb3vhVsB3GMYI6r6bv/mIdyY4palMp4Sr9+W36ruD0JQWfBhLPCSN+hps78UbvG6JXVY/AXW8Cweb5cxgq+IgYeMfhHkpciM4wq8I7uZ0kcMw9+pL76alR0DkvQW/eedRwdkx1/b4pXDds4YPlbSAHas3nVgc/RrfGIQJ1tDnFyRK50M85UHs9j59jGMB/Bho4zv+gEU5EzIQuUPaCY0sdRohlIWCqIynw0PycXoJ7eeCuhrCd3U9FD1XuCVKtOPfX2U3altG+lcVUpyjgP+3dYffy/mDzC6vTljrAuxXHtoePKGJWvB6OS25CSmjSngLlV3ZbXFIDCi2RQpKrjVknksY86vz/sch6ul+qHi/m7r2zptSixRHV9c+BDd1EjAllDmCHp3R9E5dF9Re5IZtna4qBBxgKFmbLgYpuvoNtRNlzQ5ZshhVC01OfQBmSGOqXmr9+KJIYMTYePz1aGUs=#012np0005536119.localdomain,192.168.122.108,np0005536119* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKDWa3W6rempWXWAD3XiR7U5lajG2mwWlQNmnR/NPY3b#012np0005536119.localdomain,192.168.122.108,np0005536119* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCmbufxfpzTVAEqjQKEi5qjhiahiXQR9yLC6XItoceyLgwK1Qq7LdpRIuli6kY/kfbLcgz9GzW0nSFv/ATL4+pE=#012 create=True mode=0644 path=/tmp/ansible.w1whuj5c state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False 
marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:18:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25523 DF PROTO=TCP SPT=44470 DPT=9882 SEQ=2245472588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B227FC0000000001030307)
Nov 26 04:18:32 localhost python3.9[142745]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.w1whuj5c' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:18:33 localhost python3.9[142839]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.w1whuj5c state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:18:34 localhost systemd[1]: session-45.scope: Deactivated successfully.
Nov 26 04:18:34 localhost systemd[1]: session-45.scope: Consumed 4.269s CPU time.
Nov 26 04:18:34 localhost systemd-logind[761]: Session 45 logged out. Waiting for processes to exit.
Nov 26 04:18:34 localhost systemd-logind[761]: Removed session 45.
Nov 26 04:18:41 localhost sshd[142854]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:18:41 localhost systemd-logind[761]: New session 46 of user zuul.
Nov 26 04:18:41 localhost systemd[1]: Started Session 46 of User zuul.
Nov 26 04:18:42 localhost python3.9[142947]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:18:43 localhost python3.9[143043]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 26 04:18:45 localhost python3.9[143137]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 04:18:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16041 DF PROTO=TCP SPT=47888 DPT=9105 SEQ=2402541362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B260A00000000001030307)
Nov 26 04:18:46 localhost python3.9[143230]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:18:47 localhost python3.9[143323]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:18:47 localhost python3.9[143417]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:18:48 localhost python3.9[143512]: ansible-ansible.builtin.file Invoked with
path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:18:49 localhost systemd-logind[761]: Session 46 logged out. Waiting for processes to exit.
Nov 26 04:18:49 localhost systemd[1]: session-46.scope: Deactivated successfully.
Nov 26 04:18:49 localhost systemd[1]: session-46.scope: Consumed 3.773s CPU time.
Nov 26 04:18:49 localhost systemd-logind[761]: Removed session 46.
Nov 26 04:18:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64246 DF PROTO=TCP SPT=49816 DPT=9102 SEQ=520698026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B27FE90000000001030307)
Nov 26 04:18:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37951 DF PROTO=TCP SPT=43500 DPT=9100 SEQ=1299689687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2811E0000000001030307)
Nov 26 04:18:54 localhost sshd[143527]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:18:54 localhost systemd-logind[761]: New session 47 of user zuul.
Nov 26 04:18:54 localhost systemd[1]: Started Session 47 of User zuul.
Nov 26 04:18:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64247 DF PROTO=TCP SPT=49816 DPT=9102 SEQ=520698026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B283FC0000000001030307)
Nov 26 04:18:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37952 DF PROTO=TCP SPT=43500 DPT=9100 SEQ=1299689687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2853D0000000001030307)
Nov 26 04:18:55 localhost python3.9[143620]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:18:56 localhost python3.9[143716]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 04:18:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64248 DF PROTO=TCP SPT=49816 DPT=9102 SEQ=520698026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B28BFD0000000001030307)
Nov 26 04:18:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12721 DF PROTO=TCP SPT=56836 DPT=9101 SEQ=3202272863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B28C9B0000000001030307)
Nov 26 04:18:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37953 DF PROTO=TCP SPT=43500 DPT=9100 SEQ=1299689687 ACK=0 WINDOW=32640 RES=0x00 SYN
URGP=0 OPT (020405500402080A52B28D3C0000000001030307)
Nov 26 04:18:57 localhost python3.9[143770]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 26 04:18:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12722 DF PROTO=TCP SPT=56836 DPT=9101 SEQ=3202272863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B290BD0000000001030307)
Nov 26 04:19:00 localhost sshd[143773]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:19:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64249 DF PROTO=TCP SPT=49816 DPT=9102 SEQ=520698026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B29BBC0000000001030307)
Nov 26 04:19:01 localhost python3.9[143864]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:19:03 localhost python3.9[143957]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:19:03 localhost python3.9[144049]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:19:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12724 DF PROTO=TCP SPT=56836 DPT=9101 SEQ=3202272863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2A87C0000000001030307)
Nov 26 04:19:04 localhost python3.9[144141]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012  * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:19:06 localhost python3.9[144231]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False
checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 26 04:19:06 localhost python3.9[144321]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:19:07 localhost python3.9[144413]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:19:07 localhost systemd[1]: session-47.scope: Deactivated successfully.
Nov 26 04:19:07 localhost systemd[1]: session-47.scope: Consumed 8.680s CPU time.
Nov 26 04:19:07 localhost systemd-logind[761]: Session 47 logged out. Waiting for processes to exit.
Nov 26 04:19:07 localhost systemd-logind[761]: Removed session 47.
Nov 26 04:19:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64250 DF PROTO=TCP SPT=49816 DPT=9102 SEQ=520698026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2BBFC0000000001030307)
Nov 26 04:19:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12725 DF PROTO=TCP SPT=56836 DPT=9101 SEQ=3202272863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2C7FD0000000001030307)
Nov 26 04:19:13 localhost sshd[144555]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:19:14 localhost systemd-logind[761]: New session 48 of user zuul.
Nov 26 04:19:14 localhost systemd[1]: Started Session 48 of User zuul.
Nov 26 04:19:15 localhost python3.9[144648]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:19:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17461 DF PROTO=TCP SPT=40598 DPT=9105 SEQ=108370506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2D5D00000000001030307)
Nov 26 04:19:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17462 DF PROTO=TCP SPT=40598 DPT=9105 SEQ=108370506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2D9BC0000000001030307)
Nov 26 04:19:17 localhost python3.9[144744]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:19:18 localhost python3.9[144836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:19:18 localhost python3.9[144909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148757.4599376-180-274710154220000/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True
remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:19:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17463 DF PROTO=TCP SPT=40598 DPT=9105 SEQ=108370506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2E1BD0000000001030307)
Nov 26 04:19:19 localhost python3.9[145001]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:19:19 localhost python3.9[145093]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:19:20 localhost python3.9[145166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148759.4610045-253-216440471158917/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:19:21 localhost python3.9[145258]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root
path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:19:21 localhost python3.9[145350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:19:22 localhost python3.9[145423]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148761.2707765-324-242569740964263/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:19:22 localhost python3.9[145515]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:19:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17464 DF PROTO=TCP SPT=40598 DPT=9105 SEQ=108370506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT
(020405500402080A52B2F17D0000000001030307) Nov 26 04:19:23 localhost python3.9[145607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:19:23 localhost python3.9[145680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148762.9943073-395-69810506555325/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:19:24 localhost python3.9[145772]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:19:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53185 DF PROTO=TCP SPT=39908 DPT=9102 SEQ=3209304611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B2F93C0000000001030307) Nov 26 04:19:25 localhost python3.9[145864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:19:25 localhost 
python3.9[145937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148764.7765245-463-139958230456469/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:19:26 localhost python3.9[146029]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:19:27 localhost python3.9[146121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:19:27 localhost python3.9[146194]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148766.6539524-532-195105698193/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:19:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24501 DF PROTO=TCP SPT=56990 DPT=9101 SEQ=3632472092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B305BD0000000001030307) Nov 26 04:19:28 localhost python3.9[146286]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:19:28 localhost python3.9[146378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:19:29 localhost python3.9[146451]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148768.4329765-607-76170827687231/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:19:30 localhost python3.9[146543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None 
Nov 26 04:19:30 localhost python3.9[146635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:19:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53187 DF PROTO=TCP SPT=39908 DPT=9102 SEQ=3209304611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B310FC0000000001030307) Nov 26 04:19:31 localhost chronyd[136746]: Selected source 149.56.19.163 (pool.ntp.org) Nov 26 04:19:31 localhost python3.9[146708]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148770.2951908-679-252619137730714/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=5ae35f2cd6e1d86b32ab15d958135d599d5a1291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:19:32 localhost systemd[1]: session-48.scope: Deactivated successfully. Nov 26 04:19:32 localhost systemd[1]: session-48.scope: Consumed 11.171s CPU time. Nov 26 04:19:32 localhost systemd-logind[761]: Session 48 logged out. Waiting for processes to exit. Nov 26 04:19:32 localhost systemd-logind[761]: Removed session 48. 
Nov 26 04:19:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24503 DF PROTO=TCP SPT=56990 DPT=9101 SEQ=3632472092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B31D7C0000000001030307) Nov 26 04:19:37 localhost sshd[146724]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:19:37 localhost systemd-logind[761]: New session 49 of user zuul. Nov 26 04:19:37 localhost systemd[1]: Started Session 49 of User zuul. Nov 26 04:19:38 localhost python3.9[146819]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:19:39 localhost python3.9[146911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:19:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53188 DF PROTO=TCP SPT=39908 DPT=9102 SEQ=3209304611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B331FC0000000001030307) Nov 26 04:19:39 localhost python3.9[146984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148778.7365854-63-105668538195391/.source.conf _original_basename=ceph.conf follow=False checksum=5363da7c09e5b1f3ad2ebedbef5afc26e1e7bde3 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:19:40 localhost python3.9[147076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:19:41 localhost python3.9[147149]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148780.0652242-63-66126985014230/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=fb7204a16245207b5739f6a2b62bcbdeec90bcc9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:19:41 localhost sshd[147164]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:19:41 localhost systemd[1]: session-49.scope: Deactivated successfully. Nov 26 04:19:41 localhost systemd[1]: session-49.scope: Consumed 2.245s CPU time. Nov 26 04:19:41 localhost systemd-logind[761]: Session 49 logged out. Waiting for processes to exit. Nov 26 04:19:41 localhost systemd-logind[761]: Removed session 49. 
Nov 26 04:19:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24504 DF PROTO=TCP SPT=56990 DPT=9101 SEQ=3632472092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B33DFC0000000001030307) Nov 26 04:19:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43042 DF PROTO=TCP SPT=45376 DPT=9882 SEQ=973594443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B33FFC0000000001030307) Nov 26 04:19:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37870 DF PROTO=TCP SPT=53196 DPT=9105 SEQ=3089786120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B34B030000000001030307) Nov 26 04:19:46 localhost sshd[147166]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:19:47 localhost systemd-logind[761]: New session 50 of user zuul. Nov 26 04:19:47 localhost systemd[1]: Started Session 50 of User zuul. 
Nov 26 04:19:48 localhost python3.9[147259]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:19:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37872 DF PROTO=TCP SPT=53196 DPT=9105 SEQ=3089786120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B356FC0000000001030307) Nov 26 04:19:49 localhost python3.9[147355]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:19:49 localhost python3.9[147447]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:19:50 localhost python3.9[147537]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:19:51 localhost python3.9[147629]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Nov 26 04:19:52 localhost python3.9[147721]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 
fact_path=/etc/ansible/facts.d Nov 26 04:19:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37873 DF PROTO=TCP SPT=53196 DPT=9105 SEQ=3089786120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B366BC0000000001030307) Nov 26 04:19:53 localhost python3.9[147775]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:19:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36618 DF PROTO=TCP SPT=54718 DPT=9102 SEQ=3042888503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B36E3D0000000001030307) Nov 26 04:19:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64252 DF PROTO=TCP SPT=49816 DPT=9102 SEQ=520698026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B379FD0000000001030307) Nov 26 04:19:57 localhost python3.9[147869]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 26 04:19:58 localhost python3[147964]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 
neutron vxlan networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present Nov 26 04:19:59 localhost python3.9[148056]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:00 localhost python3.9[148148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:00 localhost python3.9[148196]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12727 DF PROTO=TCP SPT=56836 DPT=9101 SEQ=3202272863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B385FC0000000001030307) Nov 26 04:20:01 localhost python3.9[148288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:01 localhost python3.9[148336]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.t828v0p2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:02 localhost python3.9[148428]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:03 localhost python3.9[148476]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:04 localhost python3.9[148568]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None 
argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:20:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2883 DF PROTO=TCP SPT=47420 DPT=9101 SEQ=2191307523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B392BC0000000001030307) Nov 26 04:20:04 localhost python3[148661]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Nov 26 04:20:05 localhost python3.9[148753]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:06 localhost python3.9[148828]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148805.1414537-432-189407067586449/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:07 localhost python3.9[148920]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:07 localhost python3.9[148995]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148806.5858305-477-245802375546345/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:08 localhost python3.9[149087]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:08 localhost python3.9[149162]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148807.872267-522-110267246068647/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36621 DF PROTO=TCP SPT=54718 DPT=9102 SEQ=3042888503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3A5FC0000000001030307) Nov 26 04:20:09 localhost python3.9[149254]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:10 localhost python3.9[149329]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148809.147056-567-20568026371673/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:10 localhost python3.9[149421]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:11 localhost python3.9[149496]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764148810.4635754-612-194556060302013/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:12 localhost python3.9[149588]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2141 DF PROTO=TCP SPT=53178 DPT=9882 SEQ=2287863369 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3B3FC0000000001030307) Nov 26 04:20:13 localhost python3.9[149680]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True 
expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:20:13 localhost python3.9[149805]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:14 localhost python3.9[149929]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:20:15 localhost python3.9[150022]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:20:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58225 DF PROTO=TCP SPT=42666 DPT=9105 SEQ=3618243819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3C0300000000001030307) Nov 26 04:20:16 localhost python3.9[150131]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:20:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58226 DF PROTO=TCP SPT=42666 DPT=9105 SEQ=3618243819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3C43C0000000001030307) Nov 26 04:20:16 localhost python3.9[150226]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:18 localhost python3.9[150317]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:20:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58227 DF PROTO=TCP SPT=42666 DPT=9105 SEQ=3618243819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3CC3C0000000001030307) Nov 26 04:20:19 localhost python3.9[150410]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . 
external_ids:hostname=np0005536118.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:21:75:a3:de" external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:20:19 localhost ovs-vsctl[150411]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005536118.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:21:75:a3:de external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch Nov 26 04:20:19 localhost sshd[150504]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:20:20 localhost python3.9[150503]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:20:20 localhost python3.9[150598]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False 
get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:20:22 localhost python3.9[150692]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:20:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58228 DF PROTO=TCP SPT=42666 DPT=9105 SEQ=3618243819 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3DBFC0000000001030307) Nov 26 04:20:23 localhost python3.9[150784]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:24 localhost python3.9[150832]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:20:24 localhost python3.9[150924]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:24 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52138 DF PROTO=TCP SPT=39878 DPT=9102 SEQ=887474935 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3E37C0000000001030307) Nov 26 04:20:25 localhost python3.9[150972]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:20:26 localhost python3.9[151064]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:26 localhost python3.9[151156]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:27 localhost python3.9[151204]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:27 localhost python3.9[151296]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53190 DF PROTO=TCP SPT=39908 DPT=9102 SEQ=3209304611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3EFFC0000000001030307) Nov 26 04:20:28 localhost python3.9[151344]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:29 localhost python3.9[151436]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:20:29 localhost systemd[1]: Reloading. Nov 26 04:20:29 localhost systemd-rc-local-generator[151458]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:20:29 localhost systemd-sysv-generator[151463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:20:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:20:30 localhost python3.9[151566]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:30 localhost python3.9[151614]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52140 DF PROTO=TCP SPT=39878 DPT=9102 SEQ=887474935 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B3FB3C0000000001030307) Nov 26 04:20:31 localhost python3.9[151706]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:31 localhost python3.9[151754]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file 
path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:32 localhost python3.9[151846]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:20:32 localhost systemd[1]: Reloading. Nov 26 04:20:32 localhost systemd-rc-local-generator[151869]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:20:32 localhost systemd-sysv-generator[151875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:20:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:20:32 localhost systemd[1]: Starting Create netns directory... Nov 26 04:20:32 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 04:20:32 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 04:20:32 localhost systemd[1]: Finished Create netns directory. 
Nov 26 04:20:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5640 DF PROTO=TCP SPT=47968 DPT=9101 SEQ=3905995171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B407FC0000000001030307) Nov 26 04:20:35 localhost python3.9[151982]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:20:36 localhost python3.9[152074]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:36 localhost python3.9[152147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148835.5867672-1344-194848703421260/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 04:20:37 localhost python3.9[152239]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:20:38 localhost python3.9[152332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:20:38 localhost python3.9[152407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148837.614473-1419-5931938252841/.source.json _original_basename=.oez8ejel follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:39 localhost python3.9[152499]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51363 DF PROTO=TCP SPT=58792 DPT=9100 SEQ=568607760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B41BFC0000000001030307) Nov 26 04:20:41 localhost python3.9[152756]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False Nov 26 
04:20:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5641 DF PROTO=TCP SPT=47968 DPT=9101 SEQ=3905995171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B427FC0000000001030307) Nov 26 04:20:42 localhost python3.9[152848]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:20:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61201 DF PROTO=TCP SPT=55232 DPT=9882 SEQ=3830614910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B429FD0000000001030307) Nov 26 04:20:43 localhost python3.9[152940]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 26 04:20:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58508 DF PROTO=TCP SPT=42398 DPT=9105 SEQ=465368665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B435600000000001030307) Nov 26 04:20:47 localhost python3[153058]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:20:48 localhost python3[153058]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c",#012 "Digest": "sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 
"quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:40:43.504967825Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 345731014,#012 "VirtualSize": 345731014,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 
"sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:2e0f9ca9a8387a3566096aacaecfe5797e3fc2585f07cb97a1706897fa1a86a3",#012 "sha256:db37b2d335b44e6a9cb2eb88713051bc469233d1e0a06670f1303bc9539b97a0"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": 
true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:39.924297673Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-li Nov 26 04:20:48 localhost podman[153108]: 2025-11-26 09:20:48.287316137 +0000 UTC m=+0.079411789 container remove 4f1b0e615ec54e9892d3e21c74d40b971337f161a3f8953c022e7aed2b6ef3d5 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 26 04:20:48 localhost python3[153058]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman 
rm --force ovn_controller Nov 26 04:20:48 localhost podman[153121]: Nov 26 04:20:48 localhost podman[153121]: 2025-11-26 09:20:48.387353596 +0000 UTC m=+0.082174168 container create 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:20:48 localhost podman[153121]: 2025-11-26 09:20:48.349176458 +0000 UTC m=+0.043997060 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Nov 26 04:20:48 localhost python3[153058]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label 
container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Nov 26 04:20:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58510 DF PROTO=TCP SPT=42398 DPT=9105 SEQ=465368665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4417C0000000001030307) Nov 26 04:20:49 localhost python3.9[153249]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:20:50 localhost python3.9[153343]: 
ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:50 localhost python3.9[153389]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:20:51 localhost python3.9[153480]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764148850.474708-1683-180519024144933/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:20:51 localhost python3.9[153526]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:20:51 localhost systemd[1]: Reloading. Nov 26 04:20:51 localhost systemd-rc-local-generator[153551]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:20:51 localhost systemd-sysv-generator[153554]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:20:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 26 04:20:52 localhost python3.9[153608]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:20:52 localhost systemd[1]: Reloading. Nov 26 04:20:52 localhost systemd-rc-local-generator[153635]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:20:52 localhost systemd-sysv-generator[153640]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:20:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:20:52 localhost systemd[1]: Starting ovn_controller container... Nov 26 04:20:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58511 DF PROTO=TCP SPT=42398 DPT=9105 SEQ=465368665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4513C0000000001030307) Nov 26 04:20:52 localhost systemd[1]: tmp-crun.8xzM4k.mount: Deactivated successfully. Nov 26 04:20:53 localhost systemd[1]: Started libcrun container. Nov 26 04:20:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2cf7aaddd7bd7b9dc8ec6ca9b73671744fcd3a4da7b63a52a8061ab1ca73f70/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Nov 26 04:20:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:20:53 localhost podman[153650]: 2025-11-26 09:20:53.052816048 +0000 UTC m=+0.144053509 container init 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:20:53 localhost ovn_controller[153664]: + sudo -E kolla_set_configs Nov 26 04:20:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:20:53 localhost podman[153650]: 2025-11-26 09:20:53.08675317 +0000 UTC m=+0.177990581 container start 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller) Nov 26 04:20:53 localhost edpm-start-podman-container[153650]: ovn_controller Nov 26 04:20:53 localhost systemd[1]: Created slice User Slice of UID 0. Nov 26 04:20:53 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 26 04:20:53 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 26 04:20:53 localhost systemd[1]: Starting User Manager for UID 0... 
Nov 26 04:20:53 localhost podman[153672]: 2025-11-26 09:20:53.226654675 +0000 UTC m=+0.133387875 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 26 04:20:53 localhost podman[153672]: 2025-11-26 09:20:53.241648606 +0000 UTC m=+0.148381836 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 26 04:20:53 localhost edpm-start-podman-container[153649]: Creating additional drop-in dependency for "ovn_controller" (123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140)
Nov 26 04:20:53 localhost podman[153672]: unhealthy
Nov 26 04:20:53 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Main process exited, code=exited, status=1/FAILURE
Nov 26 04:20:53 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Failed with result 'exit-code'.
Nov 26 04:20:53 localhost systemd[1]: Reloading.
Nov 26 04:20:53 localhost systemd[153694]: Queued start job for default target Main User Target.
Nov 26 04:20:53 localhost systemd[153694]: Created slice User Application Slice.
Nov 26 04:20:53 localhost systemd-journald[47778]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 26 04:20:53 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 26 04:20:53 localhost systemd[153694]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 26 04:20:53 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 26 04:20:53 localhost systemd[153694]: Started Daily Cleanup of User's Temporary Directories.
Nov 26 04:20:53 localhost systemd[153694]: Reached target Paths.
Nov 26 04:20:53 localhost systemd[153694]: Reached target Timers.
Nov 26 04:20:53 localhost systemd[153694]: Starting D-Bus User Message Bus Socket...
Nov 26 04:20:53 localhost systemd[153694]: Starting Create User's Volatile Files and Directories...
Nov 26 04:20:53 localhost systemd[153694]: Finished Create User's Volatile Files and Directories.
Nov 26 04:20:53 localhost systemd[153694]: Listening on D-Bus User Message Bus Socket.
Nov 26 04:20:53 localhost systemd[153694]: Reached target Sockets.
Nov 26 04:20:53 localhost systemd[153694]: Reached target Basic System.
Nov 26 04:20:53 localhost systemd[153694]: Reached target Main User Target.
Nov 26 04:20:53 localhost systemd[153694]: Startup finished in 132ms.
Nov 26 04:20:53 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 26 04:20:53 localhost systemd-rc-local-generator[153749]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:20:53 localhost systemd-sysv-generator[153752]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:20:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:20:53 localhost systemd[1]: Started User Manager for UID 0.
Nov 26 04:20:53 localhost systemd[1]: Started ovn_controller container.
Nov 26 04:20:53 localhost systemd[1]: Started Session c12 of User root.
Nov 26 04:20:53 localhost ovn_controller[153664]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 26 04:20:53 localhost ovn_controller[153664]: INFO:__main__:Validating config file
Nov 26 04:20:53 localhost ovn_controller[153664]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 26 04:20:53 localhost ovn_controller[153664]: INFO:__main__:Writing out command to execute
Nov 26 04:20:53 localhost systemd[1]: session-c12.scope: Deactivated successfully.
Nov 26 04:20:53 localhost ovn_controller[153664]: ++ cat /run_command
Nov 26 04:20:53 localhost ovn_controller[153664]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 26 04:20:53 localhost ovn_controller[153664]: + ARGS=
Nov 26 04:20:53 localhost ovn_controller[153664]: + sudo kolla_copy_cacerts
Nov 26 04:20:53 localhost systemd[1]: Started Session c13 of User root.
Nov 26 04:20:53 localhost ovn_controller[153664]: + [[ ! -n '' ]]
Nov 26 04:20:53 localhost ovn_controller[153664]: + . kolla_extend_start
Nov 26 04:20:53 localhost ovn_controller[153664]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Nov 26 04:20:53 localhost ovn_controller[153664]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 26 04:20:53 localhost ovn_controller[153664]: + umask 0022
Nov 26 04:20:53 localhost ovn_controller[153664]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Nov 26 04:20:53 localhost systemd[1]: session-c13.scope: Deactivated successfully.
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00013|main|INFO|OVS feature set changed, force recompute.
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00016|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00019|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00020|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00021|main|INFO|OVS feature set changed, force recompute.
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-0e4a56-0
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-9f6a17-0
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-7174ad-0
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00026|binding|INFO|Claiming lport 5afdc9d0-9595-4904-b83b-3d24f739ffec for this chassis.
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00027|binding|INFO|5afdc9d0-9595-4904-b83b-3d24f739ffec: Claiming fa:16:3e:8c:0f:d8 192.168.0.160
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00028|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00029|binding|INFO|Removing lport 5afdc9d0-9595-4904-b83b-3d24f739ffec ovn-installed in OVS
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-0e4a56-0
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-9f6a17-0
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-7174ad-0
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00033|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00034|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00035|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00036|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00037|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:53 localhost ovn_controller[153664]: 2025-11-26T09:20:53Z|00038|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:54 localhost python3.9[153864]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:20:54 localhost ovs-vsctl[153865]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 26 04:20:54 localhost ovn_controller[153664]: 2025-11-26T09:20:54Z|00039|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34688 DF PROTO=TCP SPT=60000 DPT=9102 SEQ=2724752433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B458BC0000000001030307)
Nov 26 04:20:55 localhost python3.9[153957]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:20:55 localhost ovs-vsctl[153959]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 26 04:20:55 localhost ovn_controller[153664]: 2025-11-26T09:20:55Z|00040|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:55 localhost ovn_controller[153664]: 2025-11-26T09:20:55Z|00041|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:20:56 localhost python3.9[154052]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:20:56 localhost ovs-vsctl[154053]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 26 04:20:56 localhost systemd[1]: session-50.scope: Deactivated successfully.
Nov 26 04:20:56 localhost systemd[1]: session-50.scope: Consumed 39.769s CPU time.
Nov 26 04:20:56 localhost systemd-logind[761]: Session 50 logged out. Waiting for processes to exit.
Nov 26 04:20:56 localhost systemd-logind[761]: Removed session 50.
Nov 26 04:20:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29778 DF PROTO=TCP SPT=37488 DPT=9101 SEQ=3873168092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4657C0000000001030307)
Nov 26 04:20:58 localhost sshd[154068]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:21:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34690 DF PROTO=TCP SPT=60000 DPT=9102 SEQ=2724752433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4707C0000000001030307)
Nov 26 04:21:01 localhost ovn_controller[153664]: 2025-11-26T09:21:01Z|00042|binding|INFO|Setting lport 5afdc9d0-9595-4904-b83b-3d24f739ffec ovn-installed in OVS
Nov 26 04:21:01 localhost ovn_controller[153664]: 2025-11-26T09:21:01Z|00043|binding|INFO|Setting lport 5afdc9d0-9595-4904-b83b-3d24f739ffec up in Southbound
Nov 26 04:21:03 localhost systemd[1]: Stopping User Manager for UID 0...
Nov 26 04:21:03 localhost systemd[153694]: Activating special unit Exit the Session...
Nov 26 04:21:03 localhost systemd[153694]: Stopped target Main User Target.
Nov 26 04:21:03 localhost systemd[153694]: Stopped target Basic System.
Nov 26 04:21:03 localhost systemd[153694]: Stopped target Paths.
Nov 26 04:21:03 localhost systemd[153694]: Stopped target Sockets.
Nov 26 04:21:03 localhost systemd[153694]: Stopped target Timers.
Nov 26 04:21:03 localhost systemd[153694]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 26 04:21:03 localhost systemd[153694]: Closed D-Bus User Message Bus Socket.
Nov 26 04:21:03 localhost systemd[153694]: Stopped Create User's Volatile Files and Directories.
Nov 26 04:21:03 localhost systemd[153694]: Removed slice User Application Slice.
Nov 26 04:21:03 localhost systemd[153694]: Reached target Shutdown.
Nov 26 04:21:03 localhost systemd[153694]: Finished Exit the Session.
Nov 26 04:21:03 localhost systemd[153694]: Reached target Exit the Session.
Nov 26 04:21:03 localhost systemd[1]: user@0.service: Deactivated successfully.
Nov 26 04:21:03 localhost systemd[1]: Stopped User Manager for UID 0.
Nov 26 04:21:03 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 26 04:21:03 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 26 04:21:03 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 26 04:21:03 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 26 04:21:03 localhost systemd[1]: Removed slice User Slice of UID 0.
Nov 26 04:21:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29780 DF PROTO=TCP SPT=37488 DPT=9101 SEQ=3873168092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B47D3D0000000001030307)
Nov 26 04:21:04 localhost sshd[154074]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:21:04 localhost systemd-logind[761]: New session 52 of user zuul.
Nov 26 04:21:04 localhost systemd[1]: Started Session 52 of User zuul.
Nov 26 04:21:05 localhost python3.9[154167]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:21:06 localhost python3.9[154263]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:21:07 localhost python3.9[154355]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:21:07 localhost python3.9[154447]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:21:08 localhost python3.9[154539]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:21:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34691 DF PROTO=TCP SPT=60000 DPT=9102 SEQ=2724752433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B48FFD0000000001030307)
Nov 26 04:21:08 localhost python3.9[154631]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:21:10 localhost python3.9[154721]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:21:11 localhost python3.9[154813]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 26 04:21:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29781 DF PROTO=TCP SPT=37488 DPT=9101 SEQ=3873168092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B49DFC0000000001030307)
Nov 26 04:21:13 localhost python3.9[154904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:21:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2711 DF PROTO=TCP SPT=59464 DPT=9882 SEQ=3785723740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B49FFC0000000001030307)
Nov 26 04:21:13 localhost python3.9[154977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148872.4374623-219-76427860921742/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:21:14 localhost python3.9[155067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:21:14 localhost python3.9[155140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148873.9044437-264-54125964157917/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:21:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11268 DF PROTO=TCP SPT=60154 DPT=9105 SEQ=427704466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4AA900000000001030307)
Nov 26 04:21:16 localhost python3.9[155275]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 04:21:16 localhost podman[155341]: 2025-11-26 09:21:16.415014973 +0000 UTC m=+0.111611113 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, build-date=2025-09-24T08:57:55, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.33.12, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 26 04:21:16 localhost podman[155341]: 2025-11-26 09:21:16.530417906 +0000 UTC m=+0.227014056 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, version=7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 26 04:21:17 localhost python3.9[155470]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 26 04:21:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11270 DF PROTO=TCP SPT=60154 DPT=9105 SEQ=427704466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4B67C0000000001030307)
Nov 26 04:21:22 localhost python3.9[155625]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 26 04:21:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11271 DF PROTO=TCP SPT=60154 DPT=9105 SEQ=427704466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4C63D0000000001030307)
Nov 26 04:21:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:21:23 localhost systemd[1]: tmp-crun.GNV67S.mount: Deactivated successfully.
Nov 26 04:21:23 localhost podman[155719]: 2025-11-26 09:21:23.834922034 +0000 UTC m=+0.091903433 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 26 04:21:23 localhost ovn_controller[153664]: 2025-11-26T09:21:23Z|00044|memory|INFO|17024 kB peak resident set size after 30.2 seconds
Nov 26 04:21:23 localhost ovn_controller[153664]: 2025-11-26T09:21:23Z|00045|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:288 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:154 ofctrl_installed_flow_usage-KB:111 ofctrl_sb_flow_ref_usage-KB:67
Nov 26 04:21:23 localhost python3.9[155718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:21:23 localhost podman[155719]: 2025-11-26 09:21:23.916283463 +0000 UTC m=+0.173264812 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 26 04:21:23 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:21:24 localhost python3.9[155813]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148883.4582956-375-49273537626983/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 26 04:21:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1572 DF PROTO=TCP SPT=57118 DPT=9102 SEQ=1899584126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4CDFD0000000001030307)
Nov 26 04:21:24 localhost python3.9[155903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:21:25 localhost python3.9[155974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644
setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148884.531842-375-14774508891962/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:26 localhost python3.9[156064]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:27 localhost python3.9[156135]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148886.3526828-507-79017420648687/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:27 localhost python3.9[156225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51365 DF PROTO=TCP SPT=58792 DPT=9100 SEQ=568607760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52B4D9FD0000000001030307) Nov 26 04:21:28 localhost python3.9[156296]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148887.3938172-507-149099849062367/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:29 localhost python3.9[156386]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:21:29 localhost python3.9[156480]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:30 localhost python3.9[156572]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:30 localhost python3.9[156620]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1574 DF PROTO=TCP SPT=57118 DPT=9102 SEQ=1899584126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4E5BC0000000001030307) Nov 26 04:21:31 localhost python3.9[156712]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:31 localhost ovn_controller[153664]: 2025-11-26T09:21:31Z|00046|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory Nov 26 04:21:31 localhost python3.9[156760]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:33 localhost python3.9[156852]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:21:33 localhost 
python3.9[156944]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37948 DF PROTO=TCP SPT=34854 DPT=9101 SEQ=4043688281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B4F23C0000000001030307) Nov 26 04:21:34 localhost python3.9[156992]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:21:35 localhost python3.9[157084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:35 localhost python3.9[157132]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 
04:21:36 localhost python3.9[157224]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:21:36 localhost systemd[1]: Reloading. Nov 26 04:21:36 localhost systemd-rc-local-generator[157246]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:21:36 localhost systemd-sysv-generator[157252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:21:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:21:37 localhost sshd[157285]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:21:37 localhost python3.9[157356]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:38 localhost python3.9[157404]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:21:38 localhost python3.9[157496]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Nov 26 04:21:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1575 DF PROTO=TCP SPT=57118 DPT=9102 SEQ=1899584126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B505FC0000000001030307) Nov 26 04:21:39 localhost python3.9[157544]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:21:40 localhost python3.9[157636]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:21:40 localhost systemd[1]: Reloading. Nov 26 04:21:40 localhost systemd-rc-local-generator[157661]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:21:40 localhost systemd-sysv-generator[157665]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:21:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:21:40 localhost systemd[1]: Starting Create netns directory... Nov 26 04:21:40 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. 
Nov 26 04:21:40 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 04:21:40 localhost systemd[1]: Finished Create netns directory. Nov 26 04:21:41 localhost python3.9[157770]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37949 DF PROTO=TCP SPT=34854 DPT=9101 SEQ=4043688281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B511FC0000000001030307) Nov 26 04:21:42 localhost python3.9[157862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:42 localhost python3.9[157935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764148901.7805843-960-38903190796628/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:44 localhost python3.9[158027]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:21:45 localhost python3.9[158119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:21:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49924 DF PROTO=TCP SPT=48756 DPT=9105 SEQ=3980765548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B51FC00000000001030307) Nov 26 04:21:46 localhost python3.9[158194]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764148905.013359-1035-233435374130007/.source.json _original_basename=.7x1qroke follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:21:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49925 DF PROTO=TCP SPT=48756 DPT=9105 SEQ=3980765548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B523BC0000000001030307) Nov 26 04:21:47 localhost python3.9[158286]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:21:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49926 DF PROTO=TCP SPT=48756 DPT=9105 SEQ=3980765548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B52BBD0000000001030307) Nov 26 04:21:49 localhost python3.9[158543]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False Nov 26 04:21:50 localhost python3.9[158635]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:21:51 localhost python3.9[158727]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 26 04:21:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49927 DF PROTO=TCP SPT=48756 DPT=9105 SEQ=3980765548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B53B7C0000000001030307) Nov 26 04:21:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:21:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59063 DF PROTO=TCP SPT=49926 DPT=9102 SEQ=1436529357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B542FD0000000001030307) Nov 26 04:21:54 localhost podman[158768]: 2025-11-26 09:21:54.827346679 +0000 UTC m=+0.087065959 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 26 04:21:54 localhost podman[158768]: 2025-11-26 09:21:54.868355068 +0000 UTC m=+0.128074398 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:21:54 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:21:55 localhost python3[158870]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:21:56 localhost python3[158870]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9",#012 "Digest": "sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:31:40.431364621Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 784198911,#012 "VirtualSize": 784198911,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc/diff:/var/lib/containers/storage/overlay/cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:03228f16e908b0892695bcc077f4378f9669ff86bd51a3747df5ce9269c56477",#012 "sha256:1bc9c5b4c351caaeaa6b900805b43669e78b079f06d9048393517dd05690b8dc",#012 "sha256:83d6638c009d9ced6da21e0f659e23221a9a8d7c283582e370f21a7551100a49"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) 
ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main 
clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf Nov 26 04:21:56 localhost podman[158921]: 2025-11-26 09:21:56.344410239 +0000 UTC m=+0.094124304 container remove 670a0f2ef4a7fe523d1dae5c11e50652bc9d25ede4b9212ac36edce6d03c0942 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8346a4a86ac2c2b1d52b2e36f598d419'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, container_name=ovn_metadata_agent, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:21:56 localhost python3[158870]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent Nov 26 04:21:56 localhost podman[158935]: Nov 26 04:21:56 localhost podman[158935]: 2025-11-26 09:21:56.444612972 +0000 UTC m=+0.076878315 container create 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:21:56 localhost podman[158935]: 2025-11-26 09:21:56.41356604 +0000 UTC m=+0.045831433 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 26 04:21:56 localhost python3[158870]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env 
KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume 
/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 26 04:21:57 localhost python3.9[159065]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:21:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57033 DF PROTO=TCP SPT=36076 DPT=9101 SEQ=1165923763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B54FBC0000000001030307) Nov 26 04:21:58 localhost python3.9[159159]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:21:59 localhost python3.9[159205]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:21:59 localhost python3.9[159296]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764148919.3057773-1299-79101995371047/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:00 localhost python3.9[159342]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:22:00 localhost systemd[1]: Reloading. Nov 26 04:22:00 localhost systemd-rc-local-generator[159365]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:22:00 localhost systemd-sysv-generator[159372]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:22:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:22:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59065 DF PROTO=TCP SPT=49926 DPT=9102 SEQ=1436529357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B55ABD0000000001030307) Nov 26 04:22:01 localhost python3.9[159423]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:22:01 localhost systemd[1]: Reloading. Nov 26 04:22:01 localhost systemd-rc-local-generator[159451]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:22:01 localhost systemd-sysv-generator[159457]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:22:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:22:01 localhost systemd[1]: Starting dnf makecache... Nov 26 04:22:01 localhost systemd[1]: Starting ovn_metadata_agent container... Nov 26 04:22:01 localhost systemd[1]: tmp-crun.pHMh6y.mount: Deactivated successfully. Nov 26 04:22:01 localhost systemd[1]: Started libcrun container. Nov 26 04:22:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/637a74e3b65d22b4a64eac10cc49c00ba6501fc28ce23004549e92491131d7fc/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Nov 26 04:22:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/637a74e3b65d22b4a64eac10cc49c00ba6501fc28ce23004549e92491131d7fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:22:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:22:01 localhost podman[159466]: 2025-11-26 09:22:01.934003969 +0000 UTC m=+0.175986778 container init 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:22:01 localhost ovn_metadata_agent[159481]: + sudo -E kolla_set_configs Nov 26 04:22:01 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:22:01 localhost podman[159466]: 2025-11-26 09:22:01.979177614 +0000 UTC m=+0.221160413 container start 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:22:01 localhost edpm-start-podman-container[159466]: ovn_metadata_agent Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Validating config file Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Copying service configuration files Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Writing out command to execute Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/external Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for 
/var/lib/neutron/kill_scripts/haproxy-kill Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/3633976c-3aa0-4c4a-aa49-e8224cd25e39.pid.haproxy Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/3633976c-3aa0-4c4a-aa49-e8224cd25e39.conf Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: ++ cat /run_command Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: + CMD=neutron-ovn-metadata-agent Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: + ARGS= Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: + sudo kolla_copy_cacerts Nov 26 04:22:02 localhost dnf[159463]: Updating Subscription Management repositories. Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: Running command: 'neutron-ovn-metadata-agent' Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: + [[ ! -n '' ]] Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: + . 
kolla_extend_start Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\''' Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: + umask 0022 Nov 26 04:22:02 localhost ovn_metadata_agent[159481]: + exec neutron-ovn-metadata-agent Nov 26 04:22:02 localhost podman[159490]: 2025-11-26 09:22:02.073878084 +0000 UTC m=+0.086805712 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 04:22:02 localhost podman[159490]: 2025-11-26 09:22:02.157228438 +0000 UTC m=+0.170156076 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 04:22:02 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:22:02 localhost edpm-start-podman-container[159465]: Creating additional drop-in dependency for "ovn_metadata_agent" (659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c) Nov 26 04:22:02 localhost systemd[1]: Reloading. Nov 26 04:22:02 localhost systemd-rc-local-generator[159559]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:22:02 localhost systemd-sysv-generator[159563]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:22:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:22:02 localhost systemd[1]: Started ovn_metadata_agent container. Nov 26 04:22:03 localhost systemd[1]: session-52.scope: Deactivated successfully. Nov 26 04:22:03 localhost systemd[1]: session-52.scope: Consumed 31.719s CPU time. Nov 26 04:22:03 localhost systemd-logind[761]: Session 52 logged out. Waiting for processes to exit. Nov 26 04:22:03 localhost systemd-logind[761]: Removed session 52. 
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.585 159486 INFO neutron.common.config [-] Logging enabled!#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.585 159486 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.585 159486 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.586 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.586 159486 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.586 159486 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.586 159486 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.586 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.586 159486 DEBUG 
neutron.agent.ovn.metadata_agent [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.587 159486 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.587 159486 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.587 159486 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.587 159486 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.587 159486 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.587 159486 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.587 159486 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.587 159486 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 
localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.588 159486 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.588 159486 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.588 159486 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.588 159486 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.588 159486 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.588 159486 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.588 159486 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] 
dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.589 159486 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.590 159486 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.590 159486 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.590 159486 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.590 159486 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.590 159486 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.590 159486 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.590 159486 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:22:03.591 159486 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.592 159486 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.592 159486 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.592 159486 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.592 159486 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.592 159486 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.592 159486 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:22:03.592 159486 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.592 159486 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.593 159486 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.593 159486 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.593 159486 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.593 159486 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.593 159486 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.593 159486 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 
09:22:03.593 159486 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.593 159486 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG 
neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.594 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.595 159486 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.596 159486 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG 
neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.597 159486 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 
09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.598 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent 
[-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.599 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG 
neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.600 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 
26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.601 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 
09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.602 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 
2025-11-26 09:22:03.603 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.603 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.603 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.603 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.603 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.603 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.603 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.603 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.603 159486 DEBUG 
neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.604 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = 
/etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.605 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG 
neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.606 159486 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.607 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 
159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.608 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 
2025-11-26 09:22:03.609 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.610 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.611 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 
159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.612 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.613 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.614 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.615 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.616 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.617 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.618 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.618 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.618 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.618 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.618 159486 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.618 159486 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.627 159486 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.627 159486 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.627 159486 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.628 159486 INFO 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.628 159486 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.644 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 8fad182b-d1fd-4eb1-a4d3-436a76a6f49e (UUID: 8fad182b-d1fd-4eb1-a4d3-436a76a6f49e) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.660 159486 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.661 159486 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.661 159486 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.661 159486 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.663 159486 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.666 159486 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 
09:22:03.677 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:0f:d8 192.168.0.160'], port_security=['fa:16:3e:8c:0f:d8 192.168.0.160'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.160/24', 'neutron:device_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005536118.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'neutron:revision_number': '7', 'neutron:security_group_ids': '10c2b79b-e9f0-444f-8b9c-e9015cac7c52 4b147283-0178-4a15-bbd3-c1ef9b53dbb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9eb25cee-4262-4506-9877-de1032fbc4e7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5afdc9d0-9595-4904-b83b-3d24f739ffec) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.677 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '8fad182b-d1fd-4eb1-a4d3-436a76a6f49e'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': 'c9580114-75b9-56a5-ae74-eb2303220346', 'neutron:ovn-metadata-sb-cfg': '1'},
name=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, nb_cfg_timestamp=1764148861757, nb_cfg=3) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.678 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 5afdc9d0-9595-4904-b83b-3d24f739ffec in datapath 3633976c-3aa0-4c4a-aa49-e8224cd25e39 bound to our chassis on insert
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.678 159486 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.679 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.679 159486 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.679 159486 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.679 159486 INFO oslo_service.service [-] Starting 1 workers
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.682 159486 DEBUG oslo_service.service [-] Started child 159587 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.684 159486 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3633976c-3aa0-4c4a-aa49-e8224cd25e39
Nov 26 04:22:03 localhost
ovn_metadata_agent[159481]: 2025-11-26 09:22:03.685 159587 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-2031005'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.686 159486 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp_4ce7v26/privsep.sock']
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.700 159587 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.700 159587 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.700 159587 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.702 159587 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.703 159587 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 26 04:22:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:03.711 159587 INFO eventlet.wsgi.server [-] (159587) wsgi starting up on
http:/var/lib/neutron/metadata_proxy
Nov 26 04:22:03 localhost dnf[159463]: Metadata cache refreshed recently.
Nov 26 04:22:03 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 26 04:22:03 localhost systemd[1]: Finished dnf makecache.
Nov 26 04:22:03 localhost systemd[1]: dnf-makecache.service: Consumed 2.102s CPU time.
Nov 26 04:22:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57035 DF PROTO=TCP SPT=36076 DPT=9101 SEQ=1165923763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5677C0000000001030307)
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.351 159486 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.352 159486 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_4ce7v26/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.232 159592 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.239 159592 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.245 159592 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.245 159592 INFO oslo.privsep.daemon [-] privsep daemon running as pid 159592
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.355
159592 DEBUG oslo.privsep.daemon [-] privsep: reply[41ef3687-92ac-49ea-8e12-3cffe9427cfd]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.836 159592 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.836 159592 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 26 04:22:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:04.836 159592 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.318 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[19fdc223-fe84-4967-81ae-2cbb7f559096]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.320 159486 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpvaizkd7a/privsep.sock']
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.900 159486 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.901 159486 DEBUG
oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpvaizkd7a/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.793 159603 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.797 159603 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.799 159603 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.799 159603 INFO oslo.privsep.daemon [-] privsep daemon running as pid 159603
Nov 26 04:22:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:05.904 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[b85c1094-9351-434d-8b6c-d442dc7d57aa]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.350 159603 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.350 159603 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.350 159603 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.832 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[363bdc6b-25e8-4eac-9ebb-aa17a5c73eb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.834 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[228155c2-9ad8-4384-aaae-c382bf9b4c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.850 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[0d6d8f39-1e47-43a4-998d-0f89a05caf60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.863 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[9407b0b4-0b97-4ef5-8942-7355c745dfba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3633976c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:45:53:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 107, 'tx_packets': 68, 'rx_bytes': 9052, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0,
'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 107, 'tx_packets': 68, 'rx_bytes': 9052, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623498, 'reachable_time': 44747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 
'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 159613, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.875 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[73ca2f52-5cfc-47ef-b6bb-2ab74adca10f]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'],
['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap3633976c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623503, 'tstamp': 623503}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 159614, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3633976c-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623504, 'tstamp': 623504}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 159614, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623506, 'tstamp': 623506}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 159614, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:5357'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623498, 'tstamp': 623498}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 159614, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:06 localhost
ovn_metadata_agent[159481]: 2025-11-26 09:22:06.920 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[bda4d0e0-2552-4564-8c83-b327a7198af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.922 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3633976c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.926 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3633976c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.927 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.927 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3633976c-30, col_values=(('external_ids', {'iface-id': '7d243368-b21b-43d3-98dc-158093f352bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.928 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 26 04:22:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:06.931 159486 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap',
'/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpmndhh0r6/privsep.sock']
Nov 26 04:22:07 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:07.550 159486 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 26 04:22:07 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:07.551 159486 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpmndhh0r6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 26 04:22:07 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:07.436 159623 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 26 04:22:07 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:07.440 159623 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 26 04:22:07 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:07.442 159623 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 26 04:22:07 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:07.442 159623 INFO oslo.privsep.daemon [-] privsep daemon running as pid 159623
Nov 26 04:22:07 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:07.556 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d7e630-2b2d-4752-b488-054f6354d295]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.011 159623 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.011 159623 DEBUG
oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.011 159623 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.458 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[08436941-2ea6-473d-9c89-67973b7edfde]: (4, ['ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.461 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, column=external_ids, values=({'neutron:ovn-metadata-id': 'c9580114-75b9-56a5-ae74-eb2303220346'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.462 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.463 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 26 04:22:08 localhost
ovn_metadata_agent[159481]: 2025-11-26 09:22:08.477 159486 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.478 159486 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.478 159486 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.478 159486 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.478 159486 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.478 159486 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.478 159486 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.479 159486 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.479 159486 DEBUG oslo_service.service [-] api_extensions_path =
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.479 159486 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.479 159486 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.479 159486 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.479 159486 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.479 159486 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.480 159486 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.480 159486 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.480 159486 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.480 159486 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.480 159486 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.480 159486 DEBUG oslo_service.service [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.480 159486 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.481 159486 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.481 159486 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.481 159486 DEBUG oslo_service.service [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.481 159486 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.481 159486 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN',
'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.482 159486 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.482 159486 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.482 159486 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.482 159486 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.482 159486 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.482 159486 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.482 159486 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.483 159486 DEBUG oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.483 159486 DEBUG oslo_service.service [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.483 159486 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.483 159486 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.483 159486 DEBUG oslo_service.service [-] host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.483 159486 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.483 159486 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.484 159486 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.484 159486 DEBUG 
oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.484 159486 DEBUG oslo_service.service [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.484 159486 DEBUG oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.484 159486 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.484 159486 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.484 159486 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.485 159486 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.485 159486 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.485 159486 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.485 159486 DEBUG oslo_service.service 
[-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.485 159486 DEBUG oslo_service.service [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.485 159486 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.486 159486 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.486 159486 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.486 159486 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.486 159486 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.486 159486 DEBUG oslo_service.service [-] 
max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.486 159486 DEBUG oslo_service.service [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.486 159486 DEBUG oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.487 159486 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.487 159486 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.487 159486 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.487 159486 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.487 159486 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.487 159486 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 
09:22:08.487 159486 DEBUG oslo_service.service [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.488 159486 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.488 159486 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.488 159486 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.488 159486 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.488 159486 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.488 159486 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.488 159486 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.488 159486 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 
localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.489 159486 DEBUG oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.489 159486 DEBUG oslo_service.service [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.489 159486 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.489 159486 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.489 159486 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.489 159486 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.489 159486 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.490 159486 DEBUG oslo_service.service [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.490 159486 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 
localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.490 159486 DEBUG oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.490 159486 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.490 159486 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.490 159486 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.490 159486 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.491 159486 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.491 159486 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.491 159486 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.491 159486 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 
localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.491 159486 DEBUG oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.491 159486 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.491 159486 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.492 159486 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.492 159486 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.492 159486 DEBUG oslo_service.service [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.492 159486 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.492 159486 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.492 159486 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 
09:22:08.492 159486 DEBUG oslo_service.service [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.493 159486 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.493 159486 DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.493 159486 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.493 159486 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.493 159486 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.493 159486 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.493 159486 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.494 159486 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.494 159486 DEBUG oslo_service.service [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.494 159486 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.494 159486 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.494 159486 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.494 159486 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.495 159486 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.495 159486 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.495 159486 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.495 159486 
DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.495 159486 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.495 159486 DEBUG oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.495 159486 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.496 159486 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.496 159486 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.496 159486 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.496 159486 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.496 159486 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.496 159486 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.496 159486 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.497 159486 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.497 159486 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.497 159486 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.497 159486 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.497 159486 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.497 159486 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.498 159486 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.498 159486 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.498 159486 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.498 159486 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.498 159486 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.498 159486 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.498 159486 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.499 159486 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.499 159486 DEBUG 
oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.499 159486 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.499 159486 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.499 159486 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.499 159486 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.499 159486 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.500 159486 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.500 159486 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.500 159486 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.500 159486 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.500 159486 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.500 159486 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.500 159486 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.501 159486 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.501 159486 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.501 159486 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.501 159486 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 
09:22:08.501 159486 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.501 159486 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.501 159486 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.502 159486 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.502 159486 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.502 159486 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.502 159486 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.502 159486 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.502 159486 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.502 159486 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.502 159486 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.503 159486 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.503 159486 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.503 159486 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.503 159486 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.503 159486 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.503 159486 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:22:08 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:22:08.504 159486 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.504 159486 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.504 159486 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.504 159486 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.504 159486 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.504 159486 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.504 159486 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.505 159486 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.505 159486 DEBUG
oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.505 159486 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.505 159486 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.505 159486 DEBUG oslo_service.service [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.505 159486 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.505 159486 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.505 159486 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.506 159486 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.506 159486 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.506 159486
DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.506 159486 DEBUG oslo_service.service [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.506 159486 DEBUG oslo_service.service [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.506 159486 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.506 159486 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.507 159486 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.507 159486 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.507 159486 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.507 159486 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.507 159486 DEBUG
oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.507 159486 DEBUG oslo_service.service [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.507 159486 DEBUG oslo_service.service [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.508 159486 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.508 159486 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.508 159486 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.508 159486 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.508 159486 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.508 159486 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26
09:22:08.508 159486 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.509 159486 DEBUG oslo_service.service [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.509 159486 DEBUG oslo_service.service [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.509 159486 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.509 159486 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.509 159486 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.509 159486 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.509 159486 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.509 159486 DEBUG oslo_service.service [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost
ovn_metadata_agent[159481]: 2025-11-26 09:22:08.510 159486 DEBUG oslo_service.service [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.510 159486 DEBUG oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.510 159486 DEBUG oslo_service.service [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.510 159486 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.510 159486 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.510 159486 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.510 159486 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.511 159486 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.511 159486 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost
ovn_metadata_agent[159481]: 2025-11-26 09:22:08.511 159486 DEBUG oslo_service.service [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.511 159486 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.511 159486 DEBUG oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.511 159486 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.511 159486 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.512 159486 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.512 159486 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.512 159486 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.512 159486 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.512 159486 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.512 159486 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.512 159486 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.513 159486 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.513 159486 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.513 159486 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.513 159486 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.513 159486 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.513 159486 DEBUG oslo_service.service
[-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.513 159486 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.514 159486 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.514 159486 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.514 159486 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.514 159486 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.514 159486 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.514 159486 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.514 159486 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26
09:22:08.515 159486 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.515 159486 DEBUG oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.515 159486 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.515 159486 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.515 159486 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.515 159486 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.515 159486 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.516 159486 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.516 159486 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08
localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.516 159486 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.516 159486 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.516 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.516 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.517 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.517 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.517 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.517 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.517
159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.517 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.517 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.518 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.518 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.518 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.518 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.518 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.518 159486 DEBUG
oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.518 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.519 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.519 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.519 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.519 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.519 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.519 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.519 159486 DEBUG
oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.520 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.520 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.520 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.520 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.520 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.520 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.520 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.521 159486 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.521 159486 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.521 159486 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.521 159486 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.521 159486 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 26 04:22:08 localhost ovn_metadata_agent[159481]: 2025-11-26 09:22:08.521 159486 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 26 04:22:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30929 DF PROTO=TCP SPT=35442 DPT=9100 SEQ=2785925651 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B57BFD0000000001030307)
Nov 26 04:22:10 localhost sshd[159628]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:22:10 localhost systemd-logind[761]: New session 53 of user zuul.
Nov 26 04:22:10 localhost systemd[1]: Started Session 53 of User zuul.
Nov 26 04:22:11 localhost python3.9[159721]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:22:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57036 DF PROTO=TCP SPT=36076 DPT=9101 SEQ=1165923763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B587FC0000000001030307)
Nov 26 04:22:12 localhost python3.9[159817]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:22:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1064 DF PROTO=TCP SPT=48856 DPT=9882 SEQ=1369882176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B589FC0000000001030307)
Nov 26 04:22:13 localhost python3.9[159922]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:22:13 localhost systemd[1]: libpod-9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da.scope: Deactivated successfully.
Nov 26 04:22:13 localhost podman[159923]: 2025-11-26 09:22:13.6951921 +0000 UTC m=+0.077470723 container died 9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 26 04:22:13 localhost podman[159923]: 2025-11-26 09:22:13.729504732 +0000 UTC m=+0.111783345 container cleanup 9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.expose-services=,
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 26 04:22:13 localhost podman[159937]: 2025-11-26 09:22:13.79242781 +0000 UTC m=+0.084802465 container remove 9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:22:13 localhost systemd[1]: libpod-conmon-9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da.scope: Deactivated successfully. Nov 26 04:22:14 localhost systemd[1]: var-lib-containers-storage-overlay-9cad70600db6f91a3503354117d6c001b29942b13ad48c875233bf63324fad13-merged.mount: Deactivated successfully. Nov 26 04:22:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9609aaa40dbc5e0c67f8fb745bb157c960dc140d8b4865f0a757076da81f19da-userdata-shm.mount: Deactivated successfully. Nov 26 04:22:15 localhost python3.9[160043]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:22:15 localhost systemd[1]: Reloading. Nov 26 04:22:15 localhost systemd-rc-local-generator[160065]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:22:15 localhost systemd-sysv-generator[160070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:22:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:22:15 localhost sshd[160080]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:22:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21025 DF PROTO=TCP SPT=52094 DPT=9105 SEQ=3759750439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B594F00000000001030307) Nov 26 04:22:16 localhost python3.9[160171]: ansible-ansible.builtin.service_facts Invoked Nov 26 04:22:16 localhost network[160188]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 04:22:16 localhost network[160189]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:22:16 localhost network[160190]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 04:22:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:22:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21027 DF PROTO=TCP SPT=52094 DPT=9105 SEQ=3759750439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5A0FD0000000001030307) Nov 26 04:22:21 localhost python3.9[160468]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:22:21 localhost systemd[1]: Reloading. Nov 26 04:22:21 localhost systemd-sysv-generator[160500]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:22:21 localhost systemd-rc-local-generator[160495]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:22:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:22:22 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. Nov 26 04:22:22 localhost python3.9[160600]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:22:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21028 DF PROTO=TCP SPT=52094 DPT=9105 SEQ=3759750439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5B0BC0000000001030307) Nov 26 04:22:23 localhost python3.9[160693]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:22:24 localhost python3.9[160786]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:22:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46020 DF PROTO=TCP SPT=60764 DPT=9102 SEQ=1195419449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5B83C0000000001030307) Nov 26 04:22:25 localhost python3.9[160879]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped 
daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:22:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:22:25 localhost podman[160881]: 2025-11-26 09:22:25.170424819 +0000 UTC m=+0.091963614 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller) Nov 26 04:22:25 localhost podman[160881]: 2025-11-26 09:22:25.213362913 +0000 UTC m=+0.134901718 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Nov 26 04:22:25 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:22:25 localhost python3.9[160997]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:22:26 localhost python3.9[161090]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:22:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1577 DF PROTO=TCP SPT=57118 DPT=9102 SEQ=1899584126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5C3FC0000000001030307) Nov 26 04:22:29 localhost python3.9[161183]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:29 localhost python3.9[161275]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:30 localhost python3.9[161367]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37951 DF PROTO=TCP SPT=34854 DPT=9101 SEQ=4043688281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5CFFC0000000001030307) Nov 26 04:22:31 localhost python3.9[161459]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:32 localhost python3.9[161551]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:22:32 localhost podman[161644]: 2025-11-26 09:22:32.658498282 +0000 UTC m=+0.077467422 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:22:32 localhost podman[161644]: 2025-11-26 09:22:32.690359617 +0000 UTC 
m=+0.109328717 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:22:32 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:22:32 localhost python3.9[161643]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:33 localhost python3.9[161754]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19278 DF PROTO=TCP SPT=36182 DPT=9101 SEQ=2451765477 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5DCBC0000000001030307) Nov 26 04:22:34 localhost python3.9[161846]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:34 localhost python3.9[161938]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:35 localhost python3.9[162030]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:36 localhost python3.9[162122]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:36 localhost python3.9[162214]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:37 localhost python3.9[162306]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:37 localhost python3.9[162398]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:22:38 localhost python3.9[162490]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:22:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46023 DF PROTO=TCP SPT=60764 DPT=9102 SEQ=1195419449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5EFFC0000000001030307) Nov 26 04:22:39 localhost python3.9[162582]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 26 04:22:40 localhost python3.9[162674]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False 
scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:22:40 localhost systemd[1]: Reloading. Nov 26 04:22:40 localhost systemd-rc-local-generator[162697]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:22:40 localhost systemd-sysv-generator[162703]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:22:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:22:41 localhost python3.9[162802]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:22:42 localhost python3.9[162895]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:22:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1019 DF PROTO=TCP SPT=46060 DPT=9882 SEQ=548572217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B5FDFC0000000001030307) Nov 26 04:22:43 localhost python3.9[162988]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True 
_raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:22:45 localhost python3.9[163081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:22:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42780 DF PROTO=TCP SPT=53818 DPT=9105 SEQ=1968428166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B60A200000000001030307) Nov 26 04:22:45 localhost python3.9[163174]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:22:46 localhost python3.9[163267]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:22:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42781 DF PROTO=TCP SPT=53818 DPT=9105 SEQ=1968428166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B60E3C0000000001030307) Nov 26 04:22:48 localhost python3.9[163360]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:22:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42782 DF PROTO=TCP SPT=53818 DPT=9105 SEQ=1968428166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6163D0000000001030307) Nov 26 04:22:49 localhost python3.9[163453]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None Nov 26 04:22:50 localhost python3.9[163546]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 26 04:22:51 localhost python3.9[163644]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005536118.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Nov 26 04:22:52 localhost python3.9[163744]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:22:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=42783 DF PROTO=TCP SPT=53818 DPT=9105 SEQ=1968428166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B625FC0000000001030307) Nov 26 04:22:53 localhost python3.9[163798]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:22:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17109 DF PROTO=TCP SPT=44392 DPT=9102 SEQ=1401379642 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B62D7C0000000001030307) Nov 26 04:22:55 localhost sshd[163801]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:22:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:22:55 localhost podman[163803]: 2025-11-26 09:22:55.89281714 +0000 UTC m=+0.388058759 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:22:55 localhost podman[163803]: 2025-11-26 09:22:55.926591167 +0000 UTC m=+0.421832796 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Nov 26 04:22:55 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:22:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30931 DF PROTO=TCP SPT=35442 DPT=9100 SEQ=2785925651 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B639FC0000000001030307) Nov 26 04:23:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17111 DF PROTO=TCP SPT=44392 DPT=9102 SEQ=1401379642 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6453C0000000001030307) Nov 26 04:23:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:23:02 localhost podman[163899]: 2025-11-26 09:23:02.824419127 +0000 UTC m=+0.085557275 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:23:02 localhost podman[163899]: 2025-11-26 09:23:02.854185124 +0000 UTC 
m=+0.115323342 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent) Nov 26 04:23:02 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:23:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:23:03.620 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:23:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:23:03.621 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:23:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:23:03.622 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:23:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26993 DF PROTO=TCP SPT=45820 DPT=9101 SEQ=3853995611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B651FC0000000001030307) Nov 26 04:23:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24952 DF PROTO=TCP SPT=52086 DPT=9100 SEQ=2136962314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B665FC0000000001030307) Nov 26 04:23:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26994 DF PROTO=TCP SPT=45820 DPT=9101 SEQ=3853995611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52B671FC0000000001030307) Nov 26 04:23:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47686 DF PROTO=TCP SPT=46498 DPT=9882 SEQ=2355613075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B673FC0000000001030307) Nov 26 04:23:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3994 DF PROTO=TCP SPT=55878 DPT=9105 SEQ=2662936692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B67F500000000001030307) Nov 26 04:23:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3996 DF PROTO=TCP SPT=55878 DPT=9105 SEQ=2662936692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B68B3D0000000001030307) Nov 26 04:23:18 localhost kernel: SELinux: Converting 2757 SID table entries... Nov 26 04:23:18 localhost kernel: SELinux: Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped). 
Nov 26 04:23:18 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 04:23:18 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 04:23:18 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 04:23:18 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 04:23:18 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 04:23:18 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 04:23:18 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 04:23:19 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=19 res=1 Nov 26 04:23:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3997 DF PROTO=TCP SPT=55878 DPT=9105 SEQ=2662936692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B69AFC0000000001030307) Nov 26 04:23:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46795 DF PROTO=TCP SPT=36008 DPT=9102 SEQ=2165894344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6A2BD0000000001030307) Nov 26 04:23:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:23:26 localhost systemd[1]: tmp-crun.IVVSmf.mount: Deactivated successfully. 
Nov 26 04:23:26 localhost podman[165046]: 2025-11-26 09:23:26.839668944 +0000 UTC m=+0.091477244 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 04:23:26 localhost podman[165046]: 2025-11-26 09:23:26.878289247 +0000 UTC m=+0.130097577 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:23:26 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:23:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56354 DF PROTO=TCP SPT=34450 DPT=9101 SEQ=203096648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6AF3C0000000001030307) Nov 26 04:23:29 localhost kernel: SELinux: Converting 2760 SID table entries... 
Nov 26 04:23:29 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 04:23:29 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 04:23:29 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 04:23:29 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 04:23:29 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 04:23:29 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 04:23:29 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 04:23:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46797 DF PROTO=TCP SPT=36008 DPT=9102 SEQ=2165894344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6BA7D0000000001030307) Nov 26 04:23:33 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=20 res=1 Nov 26 04:23:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:23:33 localhost sshd[165096]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:23:33 localhost podman[165080]: 2025-11-26 09:23:33.85019118 +0000 UTC m=+0.099529905 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent) 
Nov 26 04:23:33 localhost podman[165080]: 2025-11-26 09:23:33.880442703 +0000 UTC m=+0.129781388 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 04:23:33 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:23:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56356 DF PROTO=TCP SPT=34450 DPT=9101 SEQ=203096648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6C6FD0000000001030307) Nov 26 04:23:37 localhost kernel: SELinux: Converting 2760 SID table entries... Nov 26 04:23:37 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 04:23:37 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 04:23:37 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 04:23:37 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 04:23:37 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 04:23:37 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 04:23:37 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 04:23:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46798 DF PROTO=TCP SPT=36008 DPT=9102 SEQ=2165894344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6D9FD0000000001030307) Nov 26 04:23:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56357 DF PROTO=TCP SPT=34450 DPT=9101 SEQ=203096648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6E7FC0000000001030307) Nov 26 04:23:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45851 DF PROTO=TCP SPT=56918 DPT=9882 
SEQ=617643682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6E9FC0000000001030307) Nov 26 04:23:45 localhost kernel: SELinux: Converting 2760 SID table entries... Nov 26 04:23:45 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 04:23:45 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 04:23:45 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 04:23:45 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 04:23:45 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 04:23:45 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 04:23:45 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 04:23:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51358 DF PROTO=TCP SPT=53488 DPT=9105 SEQ=3364427527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B6F4800000000001030307) Nov 26 04:23:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51360 DF PROTO=TCP SPT=53488 DPT=9105 SEQ=3364427527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7007C0000000001030307) Nov 26 04:23:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51361 DF PROTO=TCP SPT=53488 DPT=9105 SEQ=3364427527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7103D0000000001030307) Nov 26 04:23:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17221 DF PROTO=TCP SPT=47940 
DPT=9102 SEQ=3963332953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B717BC0000000001030307) Nov 26 04:23:55 localhost kernel: SELinux: Converting 2760 SID table entries... Nov 26 04:23:55 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 26 04:23:55 localhost kernel: SELinux: policy capability open_perms=1 Nov 26 04:23:55 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 26 04:23:55 localhost kernel: SELinux: policy capability always_check_network=0 Nov 26 04:23:55 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 26 04:23:55 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 26 04:23:55 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 26 04:23:57 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=23 res=1 Nov 26 04:23:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:23:57 localhost systemd[1]: tmp-crun.mXmORN.mount: Deactivated successfully. 
Nov 26 04:23:57 localhost podman[165129]: 2025-11-26 09:23:57.850366571 +0000 UTC m=+0.096872099 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:23:57 localhost podman[165129]: 2025-11-26 09:23:57.908314165 +0000 UTC m=+0.154819693 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, 
config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:23:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24954 DF PROTO=TCP SPT=52086 DPT=9100 SEQ=2136962314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B723FC0000000001030307) Nov 26 04:23:57 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:24:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17223 DF PROTO=TCP SPT=47940 DPT=9102 SEQ=3963332953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B72F7C0000000001030307) Nov 26 04:24:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:24:03.622 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:24:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:24:03.622 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:24:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:24:03.623 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:24:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29191 DF PROTO=TCP SPT=48246 DPT=9101 SEQ=455676012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B73C3D0000000001030307) Nov 26 04:24:04 localhost kernel: SELinux: Converting 2760 SID table entries... 
Nov 26 04:24:04 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 26 04:24:04 localhost kernel: SELinux: policy capability open_perms=1
Nov 26 04:24:04 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 26 04:24:04 localhost kernel: SELinux: policy capability always_check_network=0
Nov 26 04:24:04 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 26 04:24:04 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 26 04:24:04 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 26 04:24:04 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=24 res=1
Nov 26 04:24:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:24:04 localhost systemd[1]: tmp-crun.wOK2SE.mount: Deactivated successfully.
Nov 26 04:24:04 localhost podman[165159]: 2025-11-26 09:24:04.842987983 +0000 UTC m=+0.095730681 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 26 04:24:04 localhost podman[165159]: 2025-11-26 09:24:04.879249351 +0000 UTC m=+0.131992019 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 26 04:24:04 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:24:05 localhost systemd[1]: Reloading.
Nov 26 04:24:05 localhost systemd-rc-local-generator[165202]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:24:05 localhost systemd-sysv-generator[165209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:24:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:24:05 localhost systemd[1]: Reloading.
Nov 26 04:24:05 localhost systemd-rc-local-generator[165239]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:24:05 localhost systemd-sysv-generator[165242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:24:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:24:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17224 DF PROTO=TCP SPT=47940 DPT=9102 SEQ=3963332953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B74FFC0000000001030307)
Nov 26 04:24:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29192 DF PROTO=TCP SPT=48246 DPT=9101 SEQ=455676012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B75BFC0000000001030307)
Nov 26 04:24:12 localhost sshd[165259]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:24:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21722 DF PROTO=TCP SPT=49228 DPT=9882 SEQ=461496817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B75DFC0000000001030307)
Nov 26 04:24:14 localhost kernel: SELinux: Converting 2761 SID table entries...
Nov 26 04:24:14 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 26 04:24:14 localhost kernel: SELinux: policy capability open_perms=1
Nov 26 04:24:14 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 26 04:24:14 localhost kernel: SELinux: policy capability always_check_network=0
Nov 26 04:24:14 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 26 04:24:14 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 26 04:24:14 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 26 04:24:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23948 DF PROTO=TCP SPT=35762 DPT=9105 SEQ=4093006147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B769B00000000001030307)
Nov 26 04:24:18 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 26 04:24:18 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=25 res=1
Nov 26 04:24:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23950 DF PROTO=TCP SPT=35762 DPT=9105 SEQ=4093006147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B775BC0000000001030307)
Nov 26 04:24:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23951 DF PROTO=TCP SPT=35762 DPT=9105 SEQ=4093006147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7857C0000000001030307)
Nov 26 04:24:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64806 DF PROTO=TCP SPT=42238 DPT=9102 SEQ=150552122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B78CFC0000000001030307)
Nov 26 04:24:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7421 DF PROTO=TCP SPT=35078 DPT=9101 SEQ=2953588210 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B799BC0000000001030307)
Nov 26 04:24:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:24:28 localhost podman[165584]: 2025-11-26 09:24:28.823450672 +0000 UTC m=+0.085100356 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 26 04:24:28 localhost podman[165584]: 2025-11-26 09:24:28.919365239 +0000 UTC m=+0.181014933 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 26 04:24:28 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:24:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64808 DF PROTO=TCP SPT=42238 DPT=9102 SEQ=150552122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7A4BC0000000001030307)
Nov 26 04:24:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7423 DF PROTO=TCP SPT=35078 DPT=9101 SEQ=2953588210 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7B17C0000000001030307)
Nov 26 04:24:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:24:35 localhost podman[167696]: 2025-11-26 09:24:35.832501678 +0000 UTC m=+0.091937568 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 26 04:24:35 localhost systemd[1]: tmp-crun.KzYP7P.mount: Deactivated successfully.
Nov 26 04:24:35 localhost podman[167696]: 2025-11-26 09:24:35.838652118 +0000 UTC m=+0.098087978 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 26 04:24:35 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:24:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8447 DF PROTO=TCP SPT=56618 DPT=9100 SEQ=3835330855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7C5FC0000000001030307)
Nov 26 04:24:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7424 DF PROTO=TCP SPT=35078 DPT=9101 SEQ=2953588210 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7D1FC0000000001030307)
Nov 26 04:24:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63571 DF PROTO=TCP SPT=48686 DPT=9882 SEQ=3283515217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7D3FD0000000001030307)
Nov 26 04:24:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12998 DF PROTO=TCP SPT=60708 DPT=9105 SEQ=4034069727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7DEDF0000000001030307)
Nov 26 04:24:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13000 DF PROTO=TCP SPT=60708 DPT=9105 SEQ=4034069727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7EAFD0000000001030307)
Nov 26 04:24:50 localhost sshd[179888]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:24:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13001 DF PROTO=TCP SPT=60708 DPT=9105 SEQ=4034069727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B7FABC0000000001030307)
Nov 26 04:24:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61419 DF PROTO=TCP SPT=52570 DPT=9102 SEQ=2549742456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8023C0000000001030307)
Nov 26 04:24:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17226 DF PROTO=TCP SPT=47940 DPT=9102 SEQ=3963332953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B80DFC0000000001030307)
Nov 26 04:24:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:24:59 localhost systemd[1]: Stopping OpenSSH server daemon...
Nov 26 04:24:59 localhost systemd[1]: sshd.service: Deactivated successfully.
Nov 26 04:24:59 localhost systemd[1]: Stopped OpenSSH server daemon.
Nov 26 04:24:59 localhost systemd[1]: sshd.service: Consumed 1.672s CPU time, read 32.0K from disk, written 4.0K to disk.
Nov 26 04:24:59 localhost systemd[1]: Stopped target sshd-keygen.target.
Nov 26 04:24:59 localhost systemd[1]: Stopping sshd-keygen.target...
Nov 26 04:24:59 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 04:24:59 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 04:24:59 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 26 04:24:59 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 26 04:24:59 localhost systemd[1]: Starting OpenSSH server daemon...
Nov 26 04:24:59 localhost sshd[183104]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:24:59 localhost systemd[1]: Started OpenSSH server daemon.
Nov 26 04:24:59 localhost podman[183091]: 2025-11-26 09:24:59.33286254 +0000 UTC m=+0.100817294 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 26 04:24:59 localhost podman[183091]: 2025-11-26 09:24:59.402411171 +0000 UTC m=+0.170365975 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 26 04:24:59 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:24:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61421 DF PROTO=TCP SPT=52570 DPT=9102 SEQ=2549742456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B819FD0000000001030307)
Nov 26 04:25:01 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 04:25:01 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 04:25:01 localhost systemd[1]: Reloading.
Nov 26 04:25:01 localhost systemd-sysv-generator[183353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:25:01 localhost systemd-rc-local-generator[183349]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:01 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 04:25:01 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 04:25:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:25:03.624 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 26 04:25:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:25:03.625 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 26 04:25:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:25:03.627 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 26 04:25:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62027 DF PROTO=TCP SPT=53416 DPT=9101 SEQ=1724907158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B826BC0000000001030307)
Nov 26 04:25:06 localhost python3.9[189097]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 26 04:25:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:25:06 localhost systemd[1]: Reloading.
Nov 26 04:25:06 localhost podman[189283]: 2025-11-26 09:25:06.266858208 +0000 UTC m=+0.095225931 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true)
Nov 26 04:25:06 localhost podman[189283]: 2025-11-26 09:25:06.306468577 +0000 UTC m=+0.134836260 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 26 04:25:06 localhost systemd-sysv-generator[189358]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:25:06 localhost systemd-rc-local-generator[189353]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:06 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:25:06 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:25:07 localhost python3.9[189683]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 26 04:25:07 localhost systemd[1]: Reloading.
Nov 26 04:25:07 localhost systemd-sysv-generator[189968]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:25:07 localhost systemd-rc-local-generator[189965]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:08 localhost python3.9[190445]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 26 04:25:08 localhost 
systemd[1]: Reloading. Nov 26 04:25:08 localhost systemd-rc-local-generator[190702]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:25:08 localhost systemd-sysv-generator[190705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:08 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61422 DF PROTO=TCP SPT=52570 DPT=9102 
SEQ=2549742456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B839FC0000000001030307) Nov 26 04:25:09 localhost python3.9[191139]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 26 04:25:09 localhost systemd[1]: Reloading. Nov 26 04:25:09 localhost systemd-rc-local-generator[191397]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:25:09 localhost systemd-sysv-generator[191404]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:12 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 26 04:25:12 localhost systemd[1]: Finished man-db-cache-update.service. Nov 26 04:25:12 localhost systemd[1]: man-db-cache-update.service: Consumed 13.630s CPU time. Nov 26 04:25:12 localhost systemd[1]: run-r69a426d1e84f4cb4b439235ee45fb50c.service: Deactivated successfully. Nov 26 04:25:12 localhost systemd[1]: run-ra238e24eacab494aa82bac0baf845346.service: Deactivated successfully. 
Nov 26 04:25:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25312 DF PROTO=TCP SPT=50988 DPT=9882 SEQ=3827049042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B847FC0000000001030307) Nov 26 04:25:13 localhost python3.9[192795]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:14 localhost systemd[1]: Reloading. Nov 26 04:25:14 localhost systemd-rc-local-generator[192821]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:25:14 localhost systemd-sysv-generator[192826]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:14 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:15 localhost python3.9[192944]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41416 DF PROTO=TCP SPT=48044 DPT=9105 SEQ=3471573022 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B854100000000001030307) Nov 26 04:25:16 localhost systemd[1]: Reloading. 
Nov 26 04:25:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41417 DF PROTO=TCP SPT=48044 DPT=9105 SEQ=3471573022 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B857FD0000000001030307) Nov 26 04:25:16 localhost systemd-sysv-generator[192977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:25:16 localhost systemd-rc-local-generator[192973]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:16 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:17 localhost python3.9[193092]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:17 localhost systemd[1]: Reloading. Nov 26 04:25:17 localhost systemd-rc-local-generator[193123]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:25:17 localhost systemd-sysv-generator[193128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41418 DF PROTO=TCP SPT=48044 DPT=9105 SEQ=3471573022 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B85FFC0000000001030307) Nov 26 04:25:18 localhost python3.9[193242]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:19 localhost python3.9[193355]: ansible-ansible.builtin.systemd Invoked with enabled=True 
masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:19 localhost systemd[1]: Reloading. Nov 26 04:25:19 localhost systemd-rc-local-generator[193385]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:25:19 localhost systemd-sysv-generator[193390]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:19 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:21 localhost python3.9[193504]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 26 04:25:21 localhost systemd[1]: Reloading. Nov 26 04:25:21 localhost systemd-sysv-generator[193534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:25:21 localhost systemd-rc-local-generator[193531]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:25:22 localhost python3.9[193653]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41419 DF PROTO=TCP SPT=48044 DPT=9105 SEQ=3471573022 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B86FBC0000000001030307) Nov 26 04:25:23 localhost python3.9[193766]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:23 localhost python3.9[193934]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:24 localhost python3.9[194077]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54594 DF PROTO=TCP SPT=51326 DPT=9102 SEQ=1757516278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8777D0000000001030307) Nov 26 04:25:25 localhost python3.9[194190]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:26 localhost python3.9[194303]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:27 localhost python3.9[194416]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1437 DF PROTO=TCP SPT=33442 DPT=9101 SEQ=2638737075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B883FC0000000001030307) Nov 26 04:25:29 localhost python3.9[194529]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:25:29 localhost podman[194643]: 2025-11-26 09:25:29.709503867 +0000 UTC m=+0.093040114 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 04:25:29 localhost podman[194643]: 2025-11-26 09:25:29.767039119 +0000 UTC m=+0.150575426 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Nov 26 04:25:29 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:25:29 localhost python3.9[194642]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:30 localhost python3.9[194780]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:30 localhost sshd[194782]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:25:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54596 DF PROTO=TCP SPT=51326 DPT=9102 SEQ=1757516278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B88F3C0000000001030307) Nov 26 04:25:32 localhost python3.9[194895]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:33 localhost python3.9[195008]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:34 localhost python3.9[195121]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1439 DF PROTO=TCP SPT=33442 DPT=9101 SEQ=2638737075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B89BBC0000000001030307) Nov 26 04:25:34 localhost python3.9[195234]: ansible-ansible.builtin.systemd Invoked with 
enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 26 04:25:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:25:36 localhost podman[195255]: 2025-11-26 09:25:36.822701782 +0000 UTC m=+0.083489460 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:25:36 localhost podman[195255]: 2025-11-26 09:25:36.833897006 +0000 UTC m=+0.094684694 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:25:36 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:25:38 localhost python3.9[195366]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:25:39 localhost python3.9[195476]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:25:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54597 DF PROTO=TCP SPT=51326 DPT=9102 SEQ=1757516278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8AFFC0000000001030307) Nov 26 04:25:40 localhost python3.9[195586]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
attributes=None Nov 26 04:25:41 localhost python3.9[195696]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:25:41 localhost python3.9[195806]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:25:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1440 DF PROTO=TCP SPT=33442 DPT=9101 SEQ=2638737075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8BBFC0000000001030307) Nov 26 04:25:42 localhost python3.9[195916]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:25:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9841 DF PROTO=TCP SPT=53308 DPT=9882 SEQ=1880828147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 
OPT (020405500402080A52B8BDFC0000000001030307) Nov 26 04:25:43 localhost python3.9[196026]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:25:44 localhost python3.9[196116]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764149142.7596326-1644-170969101782856/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:44 localhost python3.9[196226]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:25:45 localhost python3.9[196316]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764149144.201014-1644-37266476085198/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43968 DF PROTO=TCP SPT=34970 DPT=9105 SEQ=2590976831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8C9400000000001030307) Nov 
26 04:25:45 localhost python3.9[196426]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:25:46 localhost python3.9[196516]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764149145.3998811-1644-67133462056306/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:47 localhost python3.9[196626]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:25:47 localhost python3.9[196716]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764149146.602227-1644-277774066050368/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:48 localhost python3.9[196826]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:25:48 localhost python3.9[196916]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 
owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764149147.7254057-1644-268957326696213/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43970 DF PROTO=TCP SPT=34970 DPT=9105 SEQ=2590976831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8D53C0000000001030307) Nov 26 04:25:50 localhost python3.9[197026]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:25:50 localhost python3.9[197116]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764149148.9134214-1644-239876544238893/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:51 localhost python3.9[197226]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:25:52 localhost python3.9[197314]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt 
src=/home/zuul/.ansible/tmp/ansible-tmp-1764149150.7998292-1644-56709625886934/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43971 DF PROTO=TCP SPT=34970 DPT=9105 SEQ=2590976831 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8E4FD0000000001030307) Nov 26 04:25:52 localhost python3.9[197424]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:25:53 localhost python3.9[197514]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764149152.4863935-1644-134534981723043/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:54 localhost python3.9[197624]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:54 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28814 DF PROTO=TCP SPT=47758 DPT=9102 SEQ=3851889085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8EC7C0000000001030307) Nov 26 04:25:55 localhost python3.9[197734]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:55 localhost python3.9[197844]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:57 localhost python3.9[197954]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:57 localhost python3.9[198064]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55735 DF PROTO=TCP SPT=36048 DPT=9101 SEQ=178345305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B8F93C0000000001030307) Nov 26 04:25:58 localhost python3.9[198174]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:58 localhost python3.9[198284]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:59 localhost python3.9[198394]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:25:59 localhost python3.9[198504]: 
ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:26:00 localhost systemd[1]: tmp-crun.GoOAlc.mount: Deactivated successfully. Nov 26 04:26:00 localhost podman[198615]: 2025-11-26 09:26:00.477030836 +0000 UTC m=+0.114745905 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible) Nov 26 04:26:00 localhost podman[198615]: 2025-11-26 09:26:00.555707314 +0000 UTC m=+0.193422383 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Nov 26 04:26:00 localhost python3.9[198614]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:00 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:26:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28816 DF PROTO=TCP SPT=47758 DPT=9102 SEQ=3851889085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9043C0000000001030307) Nov 26 04:26:01 localhost python3.9[198749]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:02 localhost python3.9[198859]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:03 localhost python3.9[198969]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:03 localhost ovn_metadata_agent[159481]: 
2025-11-26 09:26:03.625 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:26:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:26:03.625 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:26:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:26:03.626 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:26:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55737 DF PROTO=TCP SPT=36048 DPT=9101 SEQ=178345305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B910FD0000000001030307) Nov 26 04:26:04 localhost python3.9[199079]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:04 localhost python3.9[199189]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:26:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 5692 writes, 25K keys, 5692 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5692 writes, 763 syncs, 7.46 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5576e0ff82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) 
CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Nov 26 04:26:05 localhost python3.9[199299]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:06 localhost python3.9[199388]: 
ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149164.9568276-2307-15124863805832/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:06 localhost python3.9[199498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:26:07 localhost systemd[1]: tmp-crun.HFTXu6.mount: Deactivated successfully. 
Nov 26 04:26:07 localhost podman[199587]: 2025-11-26 09:26:07.072121002 +0000 UTC m=+0.086699681 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3) Nov 26 04:26:07 localhost podman[199587]: 2025-11-26 09:26:07.106402863 +0000 UTC 
m=+0.120981482 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 04:26:07 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:26:07 localhost python3.9[199586]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149166.1793969-2307-193571955090559/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:07 localhost python3.9[199714]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:08 localhost python3.9[199802]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149167.3335133-2307-268084191927949/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28817 DF PROTO=TCP SPT=47758 DPT=9102 SEQ=3851889085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B923FC0000000001030307) Nov 26 04:26:09 localhost python3.9[199912]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:09 localhost sshd[199975]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:26:09 localhost python3.9[200001]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149168.554101-2307-128584214163528/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 26 04:26:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6000.1 total, 600.0 interval
Cumulative writes: 4860 writes, 21K keys, 4860 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 4860 writes, 621 syncs, 7.83 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.02 0.00 1 0.022 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5593355362d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x5593355362d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.1 total, 4800.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdo
Nov 26 04:26:10 localhost python3.9[200112]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:10 localhost python3.9[200200]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149169.7121298-2307-71396112081369/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:12 localhost python3.9[200310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55738 DF PROTO=TCP SPT=36048 DPT=9101 SEQ=178345305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B931FC0000000001030307)
Nov 26 04:26:12 localhost python3.9[200398]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149171.6012664-2307-157172826697327/.source.conf follow=False _original_basename=libvirt-socket.unit.j2
checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55688 DF PROTO=TCP SPT=40926 DPT=9882 SEQ=2707297842 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B933FC0000000001030307) Nov 26 04:26:13 localhost python3.9[200508]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:14 localhost python3.9[200596]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149172.7266288-2307-37194673506010/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:14 localhost python3.9[200706]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:15 localhost python3.9[200794]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1764149174.4289246-2307-168202796752965/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4482 DF PROTO=TCP SPT=56900 DPT=9105 SEQ=4056193305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B93E6F0000000001030307) Nov 26 04:26:16 localhost python3.9[200904]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:16 localhost python3.9[200992]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149175.5881906-2307-114622220973232/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:17 localhost python3.9[201102]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:17 localhost python3.9[201190]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149176.714557-2307-254256007084788/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:18 localhost python3.9[201300]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4484 DF PROTO=TCP SPT=56900 DPT=9105 SEQ=4056193305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B94A7C0000000001030307) Nov 26 04:26:18 localhost python3.9[201388]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149177.8518083-2307-3115513719606/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:19 localhost python3.9[201498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:20 
localhost python3.9[201586]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149179.001608-2307-133098534282928/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:20 localhost python3.9[201696]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:21 localhost python3.9[201784]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149180.1438506-2307-170018616570588/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:21 localhost python3.9[201894]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:26:22 localhost python3.9[201982]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149181.2954-2307-44176328611584/.source.conf 
follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:26:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4485 DF PROTO=TCP SPT=56900 DPT=9105 SEQ=4056193305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B95A3C0000000001030307) Nov 26 04:26:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1737 DF PROTO=TCP SPT=43748 DPT=9102 SEQ=3549600696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B961BC0000000001030307) Nov 26 04:26:25 localhost python3.9[202146]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:26:26 localhost python3.9[202289]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Nov 26 04:26:27 localhost python3.9[202399]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:26:27 localhost systemd[1]: Reloading. Nov 26 04:26:27 localhost systemd-sysv-generator[202424]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:26:27 localhost systemd-rc-local-generator[202417]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:27 localhost systemd[1]: Starting libvirt logging daemon socket... Nov 26 04:26:27 localhost systemd[1]: Listening on libvirt logging daemon socket. Nov 26 04:26:27 localhost systemd[1]: Starting libvirt logging daemon admin socket... Nov 26 04:26:27 localhost systemd[1]: Listening on libvirt logging daemon admin socket. Nov 26 04:26:27 localhost systemd[1]: Starting libvirt logging daemon... 
Nov 26 04:26:27 localhost systemd[1]: Started libvirt logging daemon. Nov 26 04:26:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48454 DF PROTO=TCP SPT=44942 DPT=9100 SEQ=4078106305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B96DFC0000000001030307) Nov 26 04:26:28 localhost python3.9[202551]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:26:28 localhost systemd[1]: Reloading. Nov 26 04:26:28 localhost systemd-rc-local-generator[202576]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:26:28 localhost systemd-sysv-generator[202580]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:28 localhost systemd[1]: Starting libvirt nodedev daemon socket... Nov 26 04:26:28 localhost systemd[1]: Listening on libvirt nodedev daemon socket. Nov 26 04:26:28 localhost systemd[1]: Starting libvirt nodedev daemon admin socket... Nov 26 04:26:28 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket... Nov 26 04:26:28 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket. Nov 26 04:26:28 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket. Nov 26 04:26:28 localhost systemd[1]: Started libvirt nodedev daemon. Nov 26 04:26:29 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Nov 26 04:26:29 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Nov 26 04:26:29 localhost setroubleshoot[202661]: Deleting alert 41e13fe7-4246-4c3e-9d2a-3104b27ca041, it is allowed in current policy Nov 26 04:26:29 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service. Nov 26 04:26:29 localhost python3.9[202727]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:26:29 localhost systemd[1]: Reloading. 
Nov 26 04:26:29 localhost systemd-sysv-generator[202764]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:26:29 localhost systemd-rc-local-generator[202761]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:26:29 localhost systemd[1]: Starting libvirt proxy daemon socket... Nov 26 04:26:29 localhost systemd[1]: Listening on libvirt proxy daemon socket. Nov 26 04:26:29 localhost systemd[1]: Starting libvirt proxy daemon admin socket... 
Nov 26 04:26:29 localhost systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 26 04:26:29 localhost systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 26 04:26:29 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 26 04:26:29 localhost systemd[1]: Started libvirt proxy daemon.
Nov 26 04:26:30 localhost setroubleshoot[202661]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 14638941-6b56-4dbb-8709-70c27d37160d
Nov 26 04:26:30 localhost setroubleshoot[202661]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 26 04:26:30 localhost setroubleshoot[202661]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 14638941-6b56-4dbb-8709-70c27d37160d
Nov 26 04:26:30 localhost setroubleshoot[202661]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 26 04:26:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:26:30 localhost python3.9[202907]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 04:26:30 localhost systemd[1]: Reloading.
Nov 26 04:26:30 localhost podman[202908]: 2025-11-26 09:26:30.850110453 +0000 UTC m=+0.109584193 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 26 04:26:30 localhost systemd-sysv-generator[202956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:26:30 localhost systemd-rc-local-generator[202951]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:26:30 localhost podman[202908]: 2025-11-26 09:26:30.889374159 +0000 UTC m=+0.148847919 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 26 04:26:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1739 DF PROTO=TCP SPT=43748 DPT=9102 SEQ=3549600696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9797D0000000001030307)
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:31 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:26:31 localhost systemd[1]: Listening on libvirt locking daemon socket.
Nov 26 04:26:31 localhost systemd[1]: Starting libvirt QEMU daemon socket...
Nov 26 04:26:31 localhost systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 26 04:26:31 localhost systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 26 04:26:31 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 26 04:26:31 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 26 04:26:31 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 26 04:26:31 localhost systemd[1]: Started libvirt QEMU daemon.
Nov 26 04:26:31 localhost python3.9[203115]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 04:26:31 localhost systemd[1]: Reloading.
Nov 26 04:26:32 localhost systemd-rc-local-generator[203146]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:26:32 localhost systemd-sysv-generator[203149]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:32 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:26:32 localhost systemd[1]: Starting libvirt secret daemon socket...
Nov 26 04:26:32 localhost systemd[1]: Listening on libvirt secret daemon socket.
Nov 26 04:26:32 localhost systemd[1]: Starting libvirt secret daemon admin socket...
Nov 26 04:26:32 localhost systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 26 04:26:32 localhost systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 26 04:26:32 localhost systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 26 04:26:32 localhost systemd[1]: Started libvirt secret daemon.
Nov 26 04:26:33 localhost python3.9[203297]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:33 localhost python3.9[203407]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 26 04:26:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30479 DF PROTO=TCP SPT=49244 DPT=9101 SEQ=3164464334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9863C0000000001030307)
Nov 26 04:26:34 localhost python3.9[203517]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:26:35 localhost python3.9[203629]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 26 04:26:37 localhost python3.9[203737]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:37 localhost python3.9[203823]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149196.5228279-3171-232328115250313/.source.xml follow=False _original_basename=secret.xml.j2 checksum=4b9603a06accac57f8dcfc3653d3792e6d34ccc6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:26:37 localhost podman[203841]: 2025-11-26 09:26:37.793617555 +0000 UTC m=+0.062726416 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 26 04:26:37 localhost podman[203841]: 2025-11-26 09:26:37.802171275 +0000 UTC m=+0.071280186 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 26 04:26:37 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:26:38 localhost python3.9[203953]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 0d5e5e6d-3c4b-5efe-8c65-346ae6715606#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:26:38 localhost python3.9[204073]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1740 DF PROTO=TCP SPT=43748 DPT=9102 SEQ=3549600696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B999FC0000000001030307)
Nov 26 04:26:40 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Nov 26 04:26:40 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Consumed 1.033s CPU time.
Nov 26 04:26:40 localhost systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 26 04:26:41 localhost python3.9[204411]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:41 localhost python3.9[204521]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30480 DF PROTO=TCP SPT=49244 DPT=9101 SEQ=3164464334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9A5FC0000000001030307)
Nov 26 04:26:42 localhost python3.9[204609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149201.363629-3336-157797275576985/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49805 DF PROTO=TCP SPT=33854 DPT=9882 SEQ=235923863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9A7FD0000000001030307)
Nov 26 04:26:43 localhost python3.9[204719]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:43 localhost python3.9[204829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:44 localhost python3.9[204886]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:45 localhost python3.9[204996]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48316 DF PROTO=TCP SPT=52344 DPT=9105 SEQ=3765147307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9B3A00000000001030307)
Nov 26 04:26:45 localhost python3.9[205053]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.j60u_jnf recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:47 localhost python3.9[205163]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:47 localhost python3.9[205220]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:48 localhost python3.9[205330]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:26:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48318 DF PROTO=TCP SPT=52344 DPT=9105 SEQ=3765147307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9BFBC0000000001030307)
Nov 26 04:26:49 localhost python3[205441]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 26 04:26:49 localhost python3.9[205551]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:50 localhost sshd[205609]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:26:50 localhost python3.9[205608]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:52 localhost python3.9[205720]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:52 localhost python3.9[205777]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48319 DF PROTO=TCP SPT=52344 DPT=9105 SEQ=3765147307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9CF7C0000000001030307)
Nov 26 04:26:53 localhost python3.9[205887]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:53 localhost python3.9[205944]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:54 localhost python3.9[206054]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:54 localhost python3.9[206111]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29884 DF PROTO=TCP SPT=42744 DPT=9102 SEQ=885741331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9D6FD0000000001030307)
Nov 26 04:26:56 localhost python3.9[206221]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:26:56 localhost python3.9[206311]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149215.0176466-3711-267232941965119/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:57 localhost python3.9[206421]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:26:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30302 DF PROTO=TCP SPT=46264 DPT=9101 SEQ=132252188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9E3BC0000000001030307)
Nov 26 04:26:58 localhost python3.9[206531]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:26:59 localhost python3.9[206644]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:27:00 localhost python3.9[206754]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:27:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29886 DF PROTO=TCP SPT=42744 DPT=9102 SEQ=885741331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9EEBD0000000001030307)
Nov 26 04:27:01 localhost python3.9[206865]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:27:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:27:01 localhost podman[206978]: 2025-11-26 09:27:01.74285983 +0000 UTC m=+0.075860900 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 26 04:27:01 localhost podman[206978]: 2025-11-26 09:27:01.772247495 +0000 UTC m=+0.105248585 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 26 04:27:01 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:27:01 localhost python3.9[206977]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:27:02 localhost python3.9[207115]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:27:03 localhost python3.9[207225]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:27:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:27:03.626 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 26 04:27:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:27:03.626 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 26 04:27:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:27:03.628 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 26 04:27:03 localhost python3.9[207313]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149222.7127528-3928-181716576985558/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:27:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30304 DF PROTO=TCP SPT=46264 DPT=9101 SEQ=132252188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52B9FB7D0000000001030307)
Nov 26 04:27:04 localhost python3.9[207423]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:27:04 localhost python3.9[207511]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149223.920122-3972-246567530699374/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:27:05 localhost python3.9[207621]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:27:06 localhost python3.9[207709]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149225.2419102-4017-248064822809946/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:27:07 localhost python3.9[207819]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:27:07 localhost systemd[1]: Reloading.
Nov 26 04:27:07 localhost systemd-rc-local-generator[207846]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:27:07 localhost systemd-sysv-generator[207850]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:07 localhost systemd[1]: Reached target edpm_libvirt.target.
Nov 26 04:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:27:08 localhost systemd[1]: tmp-crun.6TzbLl.mount: Deactivated successfully.
Nov 26 04:27:08 localhost podman[207915]: 2025-11-26 09:27:08.323218612 +0000 UTC m=+0.084024109 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 26 04:27:08 localhost podman[207915]: 2025-11-26 09:27:08.336253009 +0000 UTC m=+0.097058436 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 26 04:27:08 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:27:09 localhost python3.9[207986]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 26 04:27:09 localhost systemd[1]: Reloading.
Nov 26 04:27:09 localhost systemd-rc-local-generator[208008]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:27:09 localhost systemd-sysv-generator[208011]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:27:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29887 DF PROTO=TCP SPT=42744 DPT=9102 SEQ=885741331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA0FFD0000000001030307)
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: Reloading.
Nov 26 04:27:09 localhost systemd-rc-local-generator[208051]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:27:09 localhost systemd-sysv-generator[208055]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:27:11 localhost systemd[1]: session-53.scope: Deactivated successfully.
Nov 26 04:27:11 localhost systemd[1]: session-53.scope: Consumed 3min 37.052s CPU time.
Nov 26 04:27:11 localhost systemd-logind[761]: Session 53 logged out. Waiting for processes to exit.
Nov 26 04:27:11 localhost systemd-logind[761]: Removed session 53.
Nov 26 04:27:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30305 DF PROTO=TCP SPT=46264 DPT=9101 SEQ=132252188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA1BFC0000000001030307)
Nov 26 04:27:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24295 DF PROTO=TCP SPT=52746 DPT=9882 SEQ=1183881049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA1DFC0000000001030307)
Nov 26 04:27:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45608 DF PROTO=TCP SPT=59776 DPT=9105 SEQ=3184167166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA28D00000000001030307)
Nov 26 04:27:16 localhost sshd[208077]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:27:16 localhost systemd-logind[761]: New session 54 of user zuul.
Nov 26 04:27:16 localhost systemd[1]: Started Session 54 of User zuul.
Nov 26 04:27:18 localhost python3.9[208188]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:27:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45610 DF PROTO=TCP SPT=59776 DPT=9105 SEQ=3184167166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA34BC0000000001030307)
Nov 26 04:27:19 localhost python3.9[208300]: ansible-ansible.builtin.service_facts Invoked
Nov 26 04:27:19 localhost network[208317]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 26 04:27:19 localhost network[208318]: 'network-scripts' will be removed from distribution in near future.
Nov 26 04:27:19 localhost network[208319]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 26 04:27:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:27:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45611 DF PROTO=TCP SPT=59776 DPT=9105 SEQ=3184167166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA447D0000000001030307)
Nov 26 04:27:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47050 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2337746120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA4C3C0000000001030307)
Nov 26 04:27:26 localhost python3.9[208587]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 26 04:27:27 localhost python3.9[208683]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 26 04:27:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1742 DF PROTO=TCP SPT=43748 DPT=9102 SEQ=3549600696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA57FC0000000001030307)
Nov 26 04:27:30 localhost sshd[208704]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:27:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47052 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2337746120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA63FC0000000001030307)
Nov 26 04:27:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:27:32 localhost podman[208706]: 2025-11-26 09:27:32.821500485 +0000 UTC m=+0.079689921 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 26 04:27:32 localhost podman[208706]: 2025-11-26 09:27:32.882493896 +0000 UTC m=+0.140683362 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 26 04:27:32 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:27:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8529 DF PROTO=TCP SPT=51938 DPT=9101 SEQ=529187046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA707D0000000001030307)
Nov 26 04:27:36 localhost python3.9[208840]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:27:37 localhost python3.9[208952]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:27:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:27:38 localhost systemd[1]: tmp-crun.7oS9Rr.mount: Deactivated successfully.
Nov 26 04:27:38 localhost podman[209063]: 2025-11-26 09:27:38.795397641 +0000 UTC m=+0.093173182 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:27:38 localhost podman[209063]: 2025-11-26 09:27:38.801443295 +0000 UTC m=+0.099218856 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 26 04:27:38 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:27:38 localhost python3.9[209062]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:27:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47053 DF PROTO=TCP SPT=56484 DPT=9102 SEQ=2337746120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA83FC0000000001030307) Nov 26 04:27:39 localhost python3.9[209191]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:27:40 localhost python3.9[209302]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:27:40 localhost python3.9[209413]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:27:42 localhost python3.9[209525]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False 
search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:27:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8530 DF PROTO=TCP SPT=51938 DPT=9101 SEQ=529187046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA8FFC0000000001030307) Nov 26 04:27:43 localhost python3.9[209635]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:27:43 localhost systemd[1]: Listening on Open-iSCSI iscsid Socket. Nov 26 04:27:45 localhost python3.9[209749]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:27:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17173 DF PROTO=TCP SPT=54262 DPT=9105 SEQ=3909334703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BA9E000000000001030307) Nov 26 04:27:46 localhost systemd[1]: Reloading. Nov 26 04:27:46 localhost systemd-rc-local-generator[209773]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:27:46 localhost systemd-sysv-generator[209777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
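The chap_algs lineinfile task above is an idempotent replace-or-append edit; a minimal sketch of the same edit against a temp copy (the real target is /etc/iscsi/iscsid.conf, and ansible's insertafter placement is approximated here by a plain append):

```shell
# Sketch of the ansible lineinfile edit above, run against a temp copy;
# the real target is /etc/iscsi/iscsid.conf.
f=/tmp/iscsid.conf.sketch
printf '#node.session.auth.chap.algs = MD5\n' > "$f"   # commented default
line='node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5'
# state=present with regexp: replace a matching line, else append
if grep -q '^node.session.auth.chap_algs' "$f"; then
  sed -i "s|^node.session.auth.chap_algs.*|$line|" "$f"
else
  printf '%s\n' "$line" >> "$f"
fi
cat "$f"
```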
Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:27:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:27:46 localhost systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi). Nov 26 04:27:46 localhost systemd[1]: Starting Open-iSCSI... Nov 26 04:27:46 localhost iscsid[209789]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Nov 26 04:27:46 localhost iscsid[209789]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. 
If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Nov 26 04:27:46 localhost iscsid[209789]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Nov 26 04:27:46 localhost iscsid[209789]: If using hardware iscsi like qla4xxx this message can be ignored. Nov 26 04:27:46 localhost iscsid[209789]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Nov 26 04:27:46 localhost iscsid[209789]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Nov 26 04:27:46 localhost iscsid[209789]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf Nov 26 04:27:46 localhost systemd[1]: Started Open-iSCSI. Nov 26 04:27:46 localhost systemd[1]: Starting Logout off all iSCSI sessions on shutdown... Nov 26 04:27:46 localhost systemd[1]: Finished Logout off all iSCSI sessions on shutdown. Nov 26 04:27:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17174 DF PROTO=TCP SPT=54262 DPT=9105 SEQ=3909334703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BAA1FC0000000001030307) Nov 26 04:27:47 localhost python3.9[209900]: ansible-ansible.builtin.service_facts Invoked Nov 26 04:27:47 localhost network[209917]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 04:27:47 localhost network[209918]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:27:47 localhost network[209919]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 04:27:48 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
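The missing-InitiatorName warning above is fixed by creating the file iscsid names; a sketch, where the iqn prefix and the /tmp path are illustrative (the real path is /etc/iscsi/initiatorname.iscsi, and iscsi-iname(8) is the usual generator when installed):

```shell
# Create the InitiatorName file iscsid warns about. The iqn prefix and the
# /tmp path are illustrative; the real path is /etc/iscsi/initiatorname.iscsi.
f=/tmp/initiatorname.iscsi.sketch
name=$(iscsi-iname -p iqn.1994-05.com.redhat 2>/dev/null \
       || echo "iqn.1994-05.com.redhat:$(hostname -s)")   # fallback if iscsi-iname is absent
printf 'InitiatorName=%s\n' "$name" > "$f"
cat "$f"
```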
Nov 26 04:27:48 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Nov 26 04:27:48 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service. Nov 26 04:27:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17175 DF PROTO=TCP SPT=54262 DPT=9105 SEQ=3909334703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BAA9FC0000000001030307) Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l d03b1efa-3eb9-47b4-a832-05a47ec94cd9 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l d03b1efa-3eb9-47b4-a832-05a47ec94cd9 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. 
confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l d03b1efa-3eb9-47b4-a832-05a47ec94cd9 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l d03b1efa-3eb9-47b4-a832-05a47ec94cd9 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. 
confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l d03b1efa-3eb9-47b4-a832-05a47ec94cd9 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l d03b1efa-3eb9-47b4-a832-05a47ec94cd9 Nov 26 04:27:49 localhost setroubleshoot[209932]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. 
confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Nov 26 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:27:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17176 DF PROTO=TCP SPT=54262 DPT=9105 SEQ=3909334703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BAB9BC0000000001030307) Nov 26 04:27:53 localhost python3.9[210168]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 26 04:27:54 localhost python3.9[210278]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Nov 26 04:27:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40006 DF PROTO=TCP SPT=43280 DPT=9102 SEQ=3243692075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BAC13D0000000001030307) Nov 26 04:27:55 localhost python3.9[210392]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf 
follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:27:55 localhost python3.9[210480]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149274.7579699-456-25395658270529/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:27:56 localhost python3.9[210590]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:27:57 localhost python3.9[210700]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:27:57 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 26 04:27:57 localhost systemd[1]: Stopped Load Kernel Modules. Nov 26 04:27:57 localhost systemd[1]: Stopping Load Kernel Modules... Nov 26 04:27:57 localhost systemd[1]: Starting Load Kernel Modules... Nov 26 04:27:57 localhost systemd-modules-load[210704]: Module 'msr' is built in Nov 26 04:27:57 localhost systemd[1]: Finished Load Kernel Modules. 
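The modules-load tasks above converge on a one-module-per-line drop-in; a minimal sketch against a temp path (the real path is /etc/modules-load.d/dm-multipath.conf, and on a real host the reload step is `systemctl restart systemd-modules-load.service`):

```shell
# Persist the dm-multipath module across boots via a modules-load.d drop-in
# (temp path here; real path /etc/modules-load.d/dm-multipath.conf).
f=/tmp/dm-multipath.conf.sketch
printf 'dm-multipath\n' > "$f"
cat "$f"
```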
Nov 26 04:27:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43083 DF PROTO=TCP SPT=40330 DPT=9100 SEQ=626328730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BACDFC0000000001030307) Nov 26 04:27:58 localhost python3.9[210815]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:27:59 localhost python3.9[210925]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:27:59 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully. Nov 26 04:27:59 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. 
Nov 26 04:28:00 localhost python3.9[211035]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:28:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40008 DF PROTO=TCP SPT=43280 DPT=9102 SEQ=3243692075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BAD8FC0000000001030307) Nov 26 04:28:01 localhost python3.9[211145]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:28:02 localhost python3.9[211233]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149280.4819086-630-160742969398037/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:02 localhost python3.9[211343]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:28:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:28:03.627 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:28:03 
localhost ovn_metadata_agent[159481]: 2025-11-26 09:28:03.629 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:28:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:28:03.630 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:28:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:28:03 localhost podman[211416]: 2025-11-26 09:28:03.823659041 +0000 UTC m=+0.073655998 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 26 04:28:03 localhost podman[211416]: 2025-11-26 09:28:03.858269938 +0000 UTC m=+0.108266925 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 04:28:03 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:28:04 localhost python3.9[211479]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25135 DF PROTO=TCP SPT=60272 DPT=9101 SEQ=4053669065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BAE5BC0000000001030307) Nov 26 04:28:04 localhost python3.9[211590]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:05 localhost python3.9[211700]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:06 localhost python3.9[211810]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:06 localhost python3.9[211920]: ansible-ansible.builtin.lineinfile Invoked with 
firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:07 localhost python3.9[212030]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:08 localhost python3.9[212140]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:08 localhost python3.9[212250]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:28:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46127 DF PROTO=TCP SPT=40496 DPT=9100 SEQ=69687199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BAF9FC0000000001030307) Nov 26 04:28:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
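The blacklist/replace/lineinfile tasks above build up /etc/multipath.conf incrementally; a hedged reconstruction of the file they converge on, written to a temp path (the exact indentation and section layout are assumptions):

```shell
# Hedged reconstruction of the multipath.conf the ansible tasks converge on
# (temp path here; real path /etc/multipath.conf).
f=/tmp/multipath.conf.sketch
cat > "$f" <<'EOF'
defaults {
    find_multipaths yes
    recheck_wwid yes
    skip_kpartx yes
    user_friendly_names no
}
blacklist {
}
EOF
grep -E 'find_multipaths|user_friendly_names' "$f"
```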
Nov 26 04:28:09 localhost podman[212363]: 2025-11-26 09:28:09.418970386 +0000 UTC m=+0.079940323 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent) Nov 26 04:28:09 localhost podman[212363]: 2025-11-26 09:28:09.423796472 +0000 UTC 
m=+0.084766369 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:28:09 localhost sshd[212379]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:28:09 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:28:09 localhost python3.9[212362]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:10 localhost python3.9[212493]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:28:11 localhost python3.9[212603]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:28:11 localhost python3.9[212660]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:28:12 localhost python3.9[212770]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True 
get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:28:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25136 DF PROTO=TCP SPT=60272 DPT=9101 SEQ=4053669065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB05FC0000000001030307) Nov 26 04:28:12 localhost python3.9[212827]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:28:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20688 DF PROTO=TCP SPT=47526 DPT=9882 SEQ=1093279844 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB07FC0000000001030307) Nov 26 04:28:13 localhost python3.9[212937]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:14 localhost python3.9[213047]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Nov 26 04:28:15 localhost python3.9[213104]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61387 DF PROTO=TCP SPT=49114 DPT=9105 SEQ=2042419453 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB13300000000001030307) Nov 26 04:28:16 localhost python3.9[213214]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:28:16 localhost python3.9[213271]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:17 localhost python3.9[213381]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system 
no_block=False force=None masked=None Nov 26 04:28:17 localhost systemd[1]: Reloading. Nov 26 04:28:17 localhost systemd-sysv-generator[213412]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:28:17 localhost systemd-rc-local-generator[213409]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:18 localhost python3.9[213529]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:28:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61389 DF PROTO=TCP SPT=49114 DPT=9105 SEQ=2042419453 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB1F3C0000000001030307) Nov 26 04:28:19 localhost python3.9[213586]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:19 localhost python3.9[213696]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:28:20 
localhost python3.9[213753]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:21 localhost python3.9[213863]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:28:21 localhost systemd[1]: Reloading. Nov 26 04:28:21 localhost systemd-sysv-generator[213891]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:28:21 localhost systemd-rc-local-generator[213888]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:21 localhost systemd[1]: Starting Create netns directory... Nov 26 04:28:21 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 04:28:21 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 04:28:21 localhost systemd[1]: Finished Create netns directory. Nov 26 04:28:22 localhost python3.9[214015]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:28:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61390 DF PROTO=TCP SPT=49114 DPT=9105 SEQ=2042419453 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB2EFD0000000001030307) Nov 26 04:28:22 localhost python3.9[214125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:28:23 
localhost python3.9[214213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149302.5559783-1251-195171803419269/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 04:28:24 localhost python3.9[214323]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:28:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35577 DF PROTO=TCP SPT=35942 DPT=9102 SEQ=1653472396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB367C0000000001030307) Nov 26 04:28:25 localhost python3.9[214433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:28:25 localhost python3.9[214521]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149304.8604906-1326-221514126244178/.source.json _original_basename=.i_nqkg27 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:26 localhost python3.9[214631]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48657 DF PROTO=TCP SPT=55576 DPT=9101 SEQ=1604459316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB433D0000000001030307) Nov 26 04:28:28 localhost systemd[1]: virtnodedevd.service: Deactivated successfully. Nov 26 04:28:29 localhost python3.9[214983]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Nov 26 04:28:29 localhost systemd[1]: virtproxyd.service: Deactivated successfully. 
Nov 26 04:28:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8532 DF PROTO=TCP SPT=51938 DPT=9101 SEQ=529187046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB4DFC0000000001030307) Nov 26 04:28:31 localhost python3.9[215136]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:28:32 localhost python3.9[215246]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 26 04:28:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48659 DF PROTO=TCP SPT=55576 DPT=9101 SEQ=1604459316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB5AFC0000000001030307) Nov 26 04:28:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:28:34 localhost systemd[1]: tmp-crun.sOYMmM.mount: Deactivated successfully. 
Nov 26 04:28:34 localhost podman[215291]: 2025-11-26 09:28:34.836064506 +0000 UTC m=+0.094491552 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller) Nov 26 04:28:34 localhost podman[215291]: 2025-11-26 09:28:34.903336148 +0000 UTC m=+0.161763194 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:28:34 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:28:36 localhost python3[215407]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:28:38 localhost podman[215421]: 2025-11-26 09:28:36.367132529 +0000 UTC m=+0.023451699 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 26 04:28:38 localhost podman[215470]: Nov 26 04:28:38 localhost podman[215470]: 2025-11-26 09:28:38.177433618 +0000 UTC m=+0.069775784 container create 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 26 04:28:38 localhost podman[215470]: 2025-11-26 09:28:38.139788952 +0000 UTC m=+0.032131118 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 26 04:28:38 localhost python3[215407]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 26 04:28:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35580 DF PROTO=TCP SPT=35942 DPT=9102 SEQ=1653472396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB6DFD0000000001030307) Nov 26 04:28:39 localhost python3.9[215616]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:28:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:28:39 localhost podman[215636]: 2025-11-26 09:28:39.789092282 +0000 UTC m=+0.056833605 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Nov 26 04:28:39 localhost podman[215636]: 2025-11-26 09:28:39.822217402 +0000 UTC 
m=+0.089958745 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:28:39 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:28:40 localhost python3.9[215746]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:40 localhost systemd[1]: virtsecretd.service: Deactivated successfully. Nov 26 04:28:41 localhost python3.9[215802]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:28:42 localhost python3.9[215911]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764149321.541309-1590-25562662135420/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:28:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48660 DF PROTO=TCP SPT=55576 DPT=9101 SEQ=1604459316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB7BFD0000000001030307) Nov 26 04:28:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17317 DF PROTO=TCP SPT=56554 DPT=9882 SEQ=3152306586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB7DFC0000000001030307) Nov 26 04:28:43 localhost python3.9[215966]: 
ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:28:43 localhost systemd[1]: Reloading. Nov 26 04:28:43 localhost systemd-rc-local-generator[215988]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:28:43 localhost systemd-sysv-generator[215993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:44 localhost python3.9[216057]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:28:44 localhost systemd[1]: Reloading. Nov 26 04:28:44 localhost systemd-sysv-generator[216087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:28:44 localhost systemd-rc-local-generator[216084]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:28:44 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:44 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:44 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:44 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 26 04:28:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:28:45 localhost systemd[1]: Starting multipathd container... Nov 26 04:28:45 localhost systemd[1]: Started libcrun container. Nov 26 04:28:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cc4f1a72bf577ce0d900c994712030e650f746bd366e0b0c0efce486854c1c3/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 26 04:28:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cc4f1a72bf577ce0d900c994712030e650f746bd366e0b0c0efce486854c1c3/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 26 04:28:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:28:45 localhost podman[216098]: 2025-11-26 09:28:45.298474902 +0000 UTC m=+0.143321768 container init 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118)
Nov 26 04:28:45 localhost multipathd[216112]: + sudo -E kolla_set_configs
Nov 26 04:28:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:28:45 localhost podman[216098]: 2025-11-26 09:28:45.336640524 +0000 UTC m=+0.181487340 container start 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 26 04:28:45 localhost podman[216098]: multipathd
Nov 26 04:28:45 localhost systemd[1]: Started multipathd container.
Nov 26 04:28:45 localhost multipathd[216112]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 26 04:28:45 localhost multipathd[216112]: INFO:__main__:Validating config file
Nov 26 04:28:45 localhost multipathd[216112]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 26 04:28:45 localhost multipathd[216112]: INFO:__main__:Writing out command to execute
Nov 26 04:28:45 localhost multipathd[216112]: ++ cat /run_command
Nov 26 04:28:45 localhost multipathd[216112]: + CMD='/usr/sbin/multipathd -d'
Nov 26 04:28:45 localhost multipathd[216112]: + ARGS=
Nov 26 04:28:45 localhost multipathd[216112]: + sudo kolla_copy_cacerts
Nov 26 04:28:45 localhost multipathd[216112]: + [[ ! -n '' ]]
Nov 26 04:28:45 localhost multipathd[216112]: + . kolla_extend_start
Nov 26 04:28:45 localhost multipathd[216112]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 26 04:28:45 localhost multipathd[216112]: Running command: '/usr/sbin/multipathd -d'
Nov 26 04:28:45 localhost multipathd[216112]: + umask 0022
Nov 26 04:28:45 localhost multipathd[216112]: + exec /usr/sbin/multipathd -d
Nov 26 04:28:45 localhost multipathd[216112]: 9800.666335 | --------start up--------
Nov 26 04:28:45 localhost multipathd[216112]: 9800.666363 | read /etc/multipath.conf
Nov 26 04:28:45 localhost multipathd[216112]: 9800.670984 | path checkers start up
Nov 26 04:28:45 localhost podman[216121]: 2025-11-26 09:28:45.437319494 +0000 UTC m=+0.097066894 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_id=multipathd)
Nov 26 04:28:45 localhost podman[216121]: 2025-11-26 09:28:45.477350277 +0000 UTC m=+0.137097637 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd)
Nov 26 04:28:45 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully.
Nov 26 04:28:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14418 DF PROTO=TCP SPT=39642 DPT=9105 SEQ=4018081988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB88600000000001030307)
Nov 26 04:28:46 localhost python3.9[216256]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 26 04:28:47 localhost python3.9[216368]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:28:48 localhost python3.9[216491]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 04:28:48 localhost systemd[1]: Stopping multipathd container...
Nov 26 04:28:48 localhost multipathd[216112]: 9803.684916 | exit (signal)
Nov 26 04:28:48 localhost multipathd[216112]: 9803.685574 | --------shut down-------
Nov 26 04:28:48 localhost systemd[1]: libpod-8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.scope: Deactivated successfully.
Nov 26 04:28:48 localhost podman[216495]: 2025-11-26 09:28:48.475194178 +0000 UTC m=+0.100495816 container died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 04:28:48 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.timer: Deactivated successfully.
Nov 26 04:28:48 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:28:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc-userdata-shm.mount: Deactivated successfully.
Nov 26 04:28:48 localhost systemd[1]: var-lib-containers-storage-overlay-6cc4f1a72bf577ce0d900c994712030e650f746bd366e0b0c0efce486854c1c3-merged.mount: Deactivated successfully.
Nov 26 04:28:48 localhost podman[216495]: 2025-11-26 09:28:48.562833207 +0000 UTC m=+0.188134805 container cleanup 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 26 04:28:48 localhost podman[216495]: multipathd
Nov 26 04:28:48 localhost podman[216522]: 2025-11-26 09:28:48.675857636 +0000 UTC m=+0.074568058 container cleanup 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd)
Nov 26 04:28:48 localhost podman[216522]: multipathd
Nov 26 04:28:48 localhost systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 26 04:28:48 localhost systemd[1]: Stopped multipathd container.
Nov 26 04:28:48 localhost systemd[1]: Starting multipathd container...
Nov 26 04:28:48 localhost systemd[1]: Started libcrun container.
Nov 26 04:28:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cc4f1a72bf577ce0d900c994712030e650f746bd366e0b0c0efce486854c1c3/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 26 04:28:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cc4f1a72bf577ce0d900c994712030e650f746bd366e0b0c0efce486854c1c3/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 26 04:28:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:28:48 localhost podman[216535]: 2025-11-26 09:28:48.837581068 +0000 UTC m=+0.131194887 container init 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 26 04:28:48 localhost multipathd[216548]: + sudo -E kolla_set_configs
Nov 26 04:28:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:28:48 localhost podman[216535]: 2025-11-26 09:28:48.864207328 +0000 UTC m=+0.157821127 container start 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 26 04:28:48 localhost podman[216535]: multipathd
Nov 26 04:28:48 localhost systemd[1]: Started multipathd container.
Nov 26 04:28:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14420 DF PROTO=TCP SPT=39642 DPT=9105 SEQ=4018081988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BB947C0000000001030307)
Nov 26 04:28:48 localhost multipathd[216548]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 26 04:28:48 localhost multipathd[216548]: INFO:__main__:Validating config file
Nov 26 04:28:48 localhost multipathd[216548]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 26 04:28:48 localhost multipathd[216548]: INFO:__main__:Writing out command to execute
Nov 26 04:28:48 localhost multipathd[216548]: ++ cat /run_command
Nov 26 04:28:48 localhost multipathd[216548]: + CMD='/usr/sbin/multipathd -d'
Nov 26 04:28:48 localhost multipathd[216548]: + ARGS=
Nov 26 04:28:48 localhost multipathd[216548]: + sudo kolla_copy_cacerts
Nov 26 04:28:48 localhost multipathd[216548]: Running command: '/usr/sbin/multipathd -d'
Nov 26 04:28:48 localhost multipathd[216548]: + [[ ! -n '' ]]
Nov 26 04:28:48 localhost multipathd[216548]: + . kolla_extend_start
Nov 26 04:28:48 localhost multipathd[216548]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 26 04:28:48 localhost multipathd[216548]: + umask 0022
Nov 26 04:28:48 localhost multipathd[216548]: + exec /usr/sbin/multipathd -d
Nov 26 04:28:48 localhost multipathd[216548]: 9804.179865 | --------start up--------
Nov 26 04:28:48 localhost multipathd[216548]: 9804.179883 | read /etc/multipath.conf
Nov 26 04:28:48 localhost multipathd[216548]: 9804.183301 | path checkers start up
Nov 26 04:28:48 localhost podman[216555]: 2025-11-26 09:28:48.951236618 +0000 UTC m=+0.078081872 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 26 04:28:48 localhost podman[216555]: 2025-11-26 09:28:48.957670245 +0000 UTC m=+0.084515519 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 26 04:28:48 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully.
Nov 26 04:28:49 localhost sshd[216602]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:28:49 localhost python3.9[216696]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:28:50 localhost python3.9[216806]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 26 04:28:51 localhost python3.9[216916]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 26 04:28:52 localhost python3.9[217034]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:28:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14421 DF PROTO=TCP SPT=39642 DPT=9105 SEQ=4018081988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBA43C0000000001030307)
Nov 26 04:28:52 localhost python3.9[217122]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149331.8770807-1830-172397233617834/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:28:54 localhost python3.9[217232]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:28:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48472 DF PROTO=TCP SPT=54684 DPT=9102 SEQ=2745082675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBABBC0000000001030307)
Nov 26 04:28:55 localhost python3.9[217342]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 04:28:55 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 26 04:28:55 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 26 04:28:55 localhost systemd[1]: Stopping Load Kernel Modules...
Nov 26 04:28:55 localhost systemd[1]: Starting Load Kernel Modules...
Nov 26 04:28:55 localhost systemd-modules-load[217346]: Module 'msr' is built in
Nov 26 04:28:55 localhost systemd[1]: Finished Load Kernel Modules.
Nov 26 04:28:56 localhost python3.9[217456]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 26 04:28:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46129 DF PROTO=TCP SPT=40496 DPT=9100 SEQ=69687199 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBB7FC0000000001030307)
Nov 26 04:29:00 localhost systemd[1]: Reloading.
Nov 26 04:29:00 localhost systemd-sysv-generator[217494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:29:00 localhost systemd-rc-local-generator[217491]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48474 DF PROTO=TCP SPT=54684 DPT=9102 SEQ=2745082675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBC37C0000000001030307)
Nov 26 04:29:00 localhost systemd[1]: Reloading.
Nov 26 04:29:01 localhost systemd-rc-local-generator[217529]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:29:01 localhost systemd-sysv-generator[217532]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd-logind[761]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 26 04:29:01 localhost systemd-logind[761]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 26 04:29:01 localhost lvm[217577]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 26 04:29:01 localhost lvm[217577]: VG ceph_vg1 finished
Nov 26 04:29:01 localhost lvm[217578]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 26 04:29:01 localhost lvm[217578]: VG ceph_vg0 finished
Nov 26 04:29:01 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 26 04:29:01 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 26 04:29:01 localhost systemd[1]: Reloading.
Nov 26 04:29:01 localhost systemd-sysv-generator[217628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:29:01 localhost systemd-rc-local-generator[217625]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:01 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 26 04:29:02 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 26 04:29:02 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 26 04:29:02 localhost systemd[1]: man-db-cache-update.service: Consumed 1.106s CPU time.
Nov 26 04:29:02 localhost systemd[1]: run-r8467a5b91706468ca5bf78932a8fecb8.service: Deactivated successfully.
Nov 26 04:29:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:29:03.628 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 26 04:29:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:29:03.629 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 26 04:29:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:29:03.631 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 26 04:29:03 localhost python3.9[218871]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 04:29:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17347 DF PROTO=TCP SPT=36438 DPT=9101 SEQ=2802427085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBD03C0000000001030307)
Nov 26 04:29:04 localhost python3.9[218985]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:29:05 localhost systemd[1]: tmp-crun.Z8ONiC.mount: Deactivated successfully.
Nov 26 04:29:05 localhost podman[219096]: 2025-11-26 09:29:05.665368279 +0000 UTC m=+0.102925514 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 26 04:29:05 localhost podman[219096]: 2025-11-26 09:29:05.718240957 +0000 UTC m=+0.155798252 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:29:05 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:29:05 localhost python3.9[219095]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 26 04:29:05 localhost systemd[1]: Reloading.
Nov 26 04:29:05 localhost systemd-rc-local-generator[219149]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:29:05 localhost systemd-sysv-generator[219152]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:06 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:29:06 localhost python3.9[219265]: ansible-ansible.builtin.service_facts Invoked
Nov 26 04:29:06 localhost network[219282]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 26 04:29:06 localhost network[219283]: 'network-scripts' will be removed from distribution in near future.
Nov 26 04:29:06 localhost network[219284]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 26 04:29:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:29:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48475 DF PROTO=TCP SPT=54684 DPT=9102 SEQ=2745082675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBE3FD0000000001030307)
Nov 26 04:29:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:29:10 localhost podman[219409]: 2025-11-26 09:29:10.513584422 +0000 UTC m=+0.089089968 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 26 04:29:10 localhost podman[219409]: 2025-11-26 09:29:10.548428837 +0000 UTC m=+0.123934393 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 26 04:29:10 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:29:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17348 DF PROTO=TCP SPT=36438 DPT=9101 SEQ=2802427085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBEFFD0000000001030307)
Nov 26 04:29:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56362 DF PROTO=TCP SPT=52410 DPT=9882 SEQ=3289484462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBF1FC0000000001030307)
Nov 26 04:29:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46292 DF PROTO=TCP SPT=42532 DPT=9105 SEQ=3869127957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BBFD900000000001030307)
Nov 26 04:29:15 localhost python3.9[219537]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:29:16 localhost python3.9[219648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:29:17 localhost python3.9[219759]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:29:18 localhost python3.9[219870]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:29:18 localhost python3.9[219981]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:29:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46294 DF PROTO=TCP SPT=42532 DPT=9105 SEQ=3869127957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC097D0000000001030307)
Nov 26 04:29:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:29:19 localhost systemd[1]: tmp-crun.iPV2M5.mount: Deactivated successfully.
Nov 26 04:29:19 localhost podman[220093]: 2025-11-26 09:29:19.293808654 +0000 UTC m=+0.087579948 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:29:19 localhost podman[220093]: 2025-11-26 09:29:19.304835086 +0000 UTC m=+0.098606390 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 26 04:29:19 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully.
Nov 26 04:29:19 localhost python3.9[220092]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:29:21 localhost python3.9[220223]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:29:22 localhost python3.9[220334]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:29:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46295 DF PROTO=TCP SPT=42532 DPT=9105 SEQ=3869127957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC193D0000000001030307)
Nov 26 04:29:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9331 DF PROTO=TCP SPT=34808 DPT=9102 SEQ=346671239 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC20FC0000000001030307)
Nov 26 04:29:25 localhost python3.9[220445]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:25 localhost python3.9[220555]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:26 localhost python3.9[220665]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:26 localhost python3.9[220775]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:26 localhost sshd[220793]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:29:27 localhost python3.9[220887]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31100 DF PROTO=TCP SPT=39948 DPT=9101 SEQ=4186245486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC2D7C0000000001030307)
Nov 26 04:29:28 localhost python3.9[220997]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:28 localhost python3.9[221107]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:29 localhost python3.9[221217]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:30 localhost python3.9[221327]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:30 localhost python3.9[221437]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9333 DF PROTO=TCP SPT=34808 DPT=9102 SEQ=346671239 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC38BC0000000001030307)
Nov 26 04:29:31 localhost python3.9[221583]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:32 localhost python3.9[221778]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:33 localhost python3.9[221888]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31102 DF PROTO=TCP SPT=39948 DPT=9101 SEQ=4186245486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC453D0000000001030307)
Nov 26 04:29:34 localhost python3.9[222016]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:35 localhost python3.9[222126]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:29:35 localhost python3.9[222236]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None
modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:29:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:29:36 localhost podman[222347]: 2025-11-26 09:29:36.451544188 +0000 UTC m=+0.070083780 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:29:36 localhost podman[222347]: 2025-11-26 09:29:36.492219207 +0000 UTC m=+0.110758799 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 04:29:36 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:29:36 localhost python3.9[222346]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:29:37 localhost python3.9[222480]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 26 04:29:38 localhost python3.9[222590]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:29:38 localhost systemd[1]: Reloading. Nov 26 04:29:38 localhost systemd-sysv-generator[222621]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:29:38 localhost systemd-rc-local-generator[222618]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:29:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9334 DF PROTO=TCP SPT=34808 DPT=9102 SEQ=346671239 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC59FC0000000001030307) Nov 26 04:29:39 localhost python3.9[222736]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 
04:29:40 localhost python3.9[222847]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:29:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:29:40 localhost podman[222959]: 2025-11-26 09:29:40.688492247 +0000 UTC m=+0.088150386 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:29:40 localhost podman[222959]: 2025-11-26 09:29:40.721018427 +0000 UTC m=+0.120676566 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:29:40 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:29:40 localhost python3.9[222958]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:29:41 localhost python3.9[223087]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:29:42 localhost python3.9[223198]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:29:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31103 DF PROTO=TCP SPT=39948 DPT=9101 SEQ=4186245486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC65FD0000000001030307) Nov 26 04:29:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=8484 DF PROTO=TCP SPT=47872 DPT=9882 SEQ=2493675436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC67FC0000000001030307) Nov 26 04:29:43 localhost python3.9[223309]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:29:43 localhost python3.9[223420]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:29:45 localhost python3.9[223531]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:29:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62185 DF PROTO=TCP SPT=58738 DPT=9105 SEQ=3904258185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC72C00000000001030307) Nov 26 04:29:47 localhost python3.9[223642]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:48 localhost 
python3.9[223752]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:48 localhost python3.9[223862]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62187 DF PROTO=TCP SPT=58738 DPT=9105 SEQ=3904258185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC7EBC0000000001030307) Nov 26 04:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:29:49 localhost podman[223972]: 2025-11-26 09:29:49.654668218 +0000 UTC m=+0.081351549 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 26 04:29:49 localhost podman[223972]: 2025-11-26 09:29:49.667146826 +0000 UTC m=+0.093830187 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:29:49 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:29:49 localhost python3.9[223973]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:50 localhost python3.9[224101]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:50 localhost python3.9[224211]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:51 localhost python3.9[224321]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:52 localhost python3.9[224431]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:52 localhost python3.9[224541]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62188 DF PROTO=TCP SPT=58738 DPT=9105 SEQ=3904258185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC8E7C0000000001030307) Nov 26 04:29:53 localhost python3.9[224651]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:29:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7594 DF PROTO=TCP SPT=51752 DPT=9102 SEQ=2614096758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BC95FD0000000001030307) Nov 26 04:29:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48477 DF PROTO=TCP SPT=54684 DPT=9102 SEQ=2745082675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BCA1FC0000000001030307) Nov 26 04:30:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7596 DF PROTO=TCP SPT=51752 DPT=9102 SEQ=2614096758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BCADBD0000000001030307) Nov 26 04:30:00 localhost python3.9[224761]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Nov 26 04:30:01 localhost python3.9[224872]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 26 04:30:02 localhost python3.9[224988]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005536118.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Nov 26 04:30:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:30:03.630 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m 
Nov 26 04:30:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:30:03.630 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:30:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:30:03.632 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:30:03 localhost sshd[225014]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:30:04 localhost systemd-logind[761]: New session 55 of user zuul. Nov 26 04:30:04 localhost systemd[1]: Started Session 55 of User zuul. Nov 26 04:30:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=978 DF PROTO=TCP SPT=54418 DPT=9101 SEQ=2875771101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BCBA7C0000000001030307) Nov 26 04:30:04 localhost systemd[1]: session-55.scope: Deactivated successfully. Nov 26 04:30:04 localhost systemd-logind[761]: Session 55 logged out. Waiting for processes to exit. Nov 26 04:30:04 localhost systemd-logind[761]: Removed session 55. 
Nov 26 04:30:05 localhost python3.9[225125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:30:06 localhost python3.9[225211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149405.2377381-3389-3181897379879/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:30:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:30:06 localhost python3.9[225319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:30:06 localhost podman[225320]: 2025-11-26 09:30:06.816135041 +0000 UTC m=+0.075131980 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 04:30:06 localhost podman[225320]: 2025-11-26 09:30:06.882406257 +0000 UTC m=+0.141403176 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:30:06 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:30:07 localhost sshd[225348]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:30:07 localhost python3.9[225402]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:30:08 localhost python3.9[225510]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:30:08 localhost python3.9[225596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149407.9167063-3389-216992955129101/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:30:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7597 DF PROTO=TCP SPT=51752 DPT=9102 SEQ=2614096758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BCCDFC0000000001030307) Nov 26 04:30:09 localhost python3.9[225704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:30:09 localhost python3.9[225790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149408.9805446-3389-277096616722897/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bf2176996cbca305070d0fff5e0027db1ed8fcef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:30:10 localhost python3.9[225898]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:30:10 localhost python3.9[225984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149410.0054603-3389-239571931628904/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:30:11 localhost 
python3.9[226092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:30:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:30:11 localhost podman[226135]: 2025-11-26 09:30:11.825828669 +0000 UTC m=+0.082350282 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent) Nov 26 04:30:11 localhost podman[226135]: 2025-11-26 09:30:11.861304972 +0000 UTC m=+0.117826585 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, 
config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 04:30:11 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:30:12 localhost python3.9[226196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149411.105701-3389-83227740034167/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:30:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=979 DF PROTO=TCP SPT=54418 DPT=9101 SEQ=2875771101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BCD9FC0000000001030307) Nov 26 04:30:12 localhost python3.9[226306]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:30:13 localhost python3.9[226416]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:30:14 localhost python3.9[226526]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:30:15 localhost python3.9[226638]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:30:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44504 DF PROTO=TCP SPT=49952 DPT=9105 SEQ=3695123191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BCE7F00000000001030307) Nov 26 04:30:16 localhost python3.9[226746]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:30:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44505 DF PROTO=TCP SPT=49952 DPT=9105 SEQ=3695123191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BCEBFC0000000001030307) Nov 26 04:30:17 localhost python3.9[226856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Nov 26 04:30:17 localhost python3.9[226942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149416.6318328-3765-97099245551515/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:30:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44506 DF PROTO=TCP SPT=49952 DPT=9105 SEQ=3695123191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BCF3FD0000000001030307) Nov 26 04:30:19 localhost python3.9[227050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:30:19 localhost python3.9[227136]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149417.8062482-3810-235702761184832/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:30:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:30:19 localhost systemd[1]: tmp-crun.4wGz3R.mount: Deactivated successfully. Nov 26 04:30:19 localhost podman[227154]: 2025-11-26 09:30:19.856396877 +0000 UTC m=+0.120775857 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd) Nov 
26 04:30:19 localhost podman[227154]: 2025-11-26 09:30:19.873332211 +0000 UTC m=+0.137711181 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd) Nov 26 04:30:19 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:30:20 localhost sshd[227216]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:30:21 localhost python3.9[227266]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Nov 26 04:30:22 localhost python3.9[227376]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:30:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44507 DF PROTO=TCP SPT=49952 DPT=9105 SEQ=3695123191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD03BC0000000001030307) Nov 26 04:30:22 localhost python3[227486]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:30:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10927 DF PROTO=TCP SPT=39814 DPT=9102 SEQ=3685693437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD0B3D0000000001030307) Nov 26 04:30:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29066 DF PROTO=TCP SPT=37998 DPT=9100 SEQ=136222429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD17FC0000000001030307) Nov 26 04:30:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10929 DF PROTO=TCP SPT=39814 DPT=9102 SEQ=3685693437 ACK=0 WINDOW=32640 RES=0x00 SYN 
URGP=0 OPT (020405500402080A52BD22FC0000000001030307) Nov 26 04:30:33 localhost podman[227500]: 2025-11-26 09:30:23.100347845 +0000 UTC m=+0.046281776 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 26 04:30:33 localhost podman[227560]: Nov 26 04:30:33 localhost podman[227560]: 2025-11-26 09:30:33.174350412 +0000 UTC m=+0.054577773 container create 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:30:33 localhost podman[227560]: 2025-11-26 09:30:33.145178885 +0000 UTC m=+0.025406226 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 26 04:30:33 localhost python3[227486]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env 
NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Nov 26 04:30:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29013 DF PROTO=TCP SPT=50798 DPT=9101 SEQ=3931593846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD2FBC0000000001030307) Nov 26 04:30:34 localhost python3.9[227776]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:30:35 localhost python3.9[227906]: 
ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Nov 26 04:30:36 localhost python3.9[228016]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:30:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:30:37 localhost systemd[1]: tmp-crun.m3eUDq.mount: Deactivated successfully. Nov 26 04:30:37 localhost podman[228127]: 2025-11-26 09:30:37.372456811 +0000 UTC m=+0.117058798 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2) Nov 26 04:30:37 localhost podman[228127]: 2025-11-26 09:30:37.439588886 +0000 UTC m=+0.184190913 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 04:30:37 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:30:37 localhost python3[228126]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:30:37 localhost python3[228126]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",#012 "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:33:31.011385583Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211770748,#012 "VirtualSize": 1211770748,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",#012 "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 26 04:30:37 localhost podman[228202]: 2025-11-26 09:30:37.87374829 +0000 UTC m=+0.088966596 container remove f9509cc0d54b50bd49a9453fd05c1d2713a585d4c3c44137d49c81f0dd67441d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '62782eb5f982aaac812488dee300321e-c7803ed1795969cb7cf47e6d4d57c4b9'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:30:37 localhost python3[228126]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Nov 26 04:30:37 localhost podman[228216]: Nov 26 04:30:37 localhost podman[228216]: 2025-11-26 09:30:37.978398979 +0000 UTC m=+0.082187999 container create 
d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 04:30:37 localhost podman[228216]: 2025-11-26 09:30:37.94321671 +0000 UTC m=+0.047005730 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 26 04:30:37 localhost python3[228126]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label 
config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Nov 26 04:30:38 localhost python3.9[228360]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False 
get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:30:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10930 DF PROTO=TCP SPT=39814 DPT=9102 SEQ=3685693437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD43FC0000000001030307) Nov 26 04:30:39 localhost python3.9[228472]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:30:40 localhost python3.9[228581]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764149439.763826-4085-149463596684393/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:30:40 localhost python3.9[228636]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:30:40 localhost systemd[1]: Reloading. Nov 26 04:30:41 localhost systemd-rc-local-generator[228658]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:30:41 localhost systemd-sysv-generator[228662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:41 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:41 localhost python3.9[228727]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:30:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:30:41 localhost systemd[1]: Reloading. Nov 26 04:30:42 localhost systemd-rc-local-generator[228765]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 04:30:42 localhost systemd-sysv-generator[228768]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:30:42 localhost podman[228729]: 2025-11-26 09:30:42.0496861 +0000 UTC m=+0.111844780 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:30:42 localhost podman[228729]: 2025-11-26 09:30:42.057537962 +0000 UTC m=+0.119696692 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:30:42 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:30:42 localhost systemd[1]: Starting nova_compute container... Nov 26 04:30:42 localhost systemd[1]: Started libcrun container. 
Nov 26 04:30:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:42 localhost podman[228785]: 2025-11-26 09:30:42.383460502 +0000 UTC m=+0.108389010 container init d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:30:42 localhost podman[228785]: 2025-11-26 09:30:42.393718531 +0000 UTC m=+0.118647029 container start d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, container_name=nova_compute) Nov 26 04:30:42 localhost podman[228785]: nova_compute Nov 26 04:30:42 localhost nova_compute[228799]: + sudo -E kolla_set_configs Nov 26 04:30:42 localhost systemd[1]: Started nova_compute container. Nov 26 04:30:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29014 DF PROTO=TCP SPT=50798 DPT=9101 SEQ=3931593846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD4FFD0000000001030307) Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Validating config file Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying service configuration files Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Deleting /etc/nova/nova.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/nova/nova.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for 
/etc/nova/nova.conf.d/01-nova.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Deleting /etc/ceph Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Creating directory /etc/ceph Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/ceph Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Nov 26 04:30:42 localhost nova_compute[228799]: 
INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Deleting /usr/sbin/iscsiadm Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Writing out command to execute Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:30:42 localhost nova_compute[228799]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 26 04:30:42 localhost nova_compute[228799]: ++ cat /run_command Nov 26 04:30:42 localhost nova_compute[228799]: + CMD=nova-compute Nov 26 04:30:42 localhost nova_compute[228799]: + ARGS= Nov 26 04:30:42 localhost nova_compute[228799]: + sudo kolla_copy_cacerts Nov 26 04:30:42 localhost nova_compute[228799]: + [[ ! 
-n '' ]] Nov 26 04:30:42 localhost nova_compute[228799]: + . kolla_extend_start Nov 26 04:30:42 localhost nova_compute[228799]: Running command: 'nova-compute' Nov 26 04:30:42 localhost nova_compute[228799]: + echo 'Running command: '\''nova-compute'\''' Nov 26 04:30:42 localhost nova_compute[228799]: + umask 0022 Nov 26 04:30:42 localhost nova_compute[228799]: + exec nova-compute Nov 26 04:30:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61923 DF PROTO=TCP SPT=45342 DPT=9882 SEQ=96408364 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD51FC0000000001030307) Nov 26 04:30:43 localhost python3.9[228919]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.190 228803 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.190 228803 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.190 228803 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.190 228803 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.307 228803 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.326 228803 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.327 228803 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Nov 26 04:30:44 localhost python3.9[229029]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.770 228803 INFO nova.virt.driver [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.859 228803 INFO nova.compute.provider_config [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] No provider configs found in /etc/nova/provider_config/. 
If files are present, ensure the Nova process has access.#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.874 228803 WARNING nova.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.874 228803 DEBUG oslo_concurrency.lockutils [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.874 228803 DEBUG oslo_concurrency.lockutils [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.875 228803 DEBUG oslo_concurrency.lockutils [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.875 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.875 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ******************************************************************************** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.875 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.875 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.876 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.876 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.876 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.876 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.876 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] backdoor_port = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.876 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.876 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.877 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.877 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.877 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.877 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.877 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.877 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.878 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.878 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.878 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] console_host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.878 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.878 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.878 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.879 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.879 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.879 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.879 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.879 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 
localhost nova_compute[228799]: 2025-11-26 09:30:44.879 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.880 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.880 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.880 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.880 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.880 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.880 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.880 228803 DEBUG 
oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.881 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.881 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.881 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.881 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.881 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.882 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.882 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.882 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.882 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.882 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.882 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.882 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.883 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.883 228803 DEBUG 
oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.883 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.883 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.883 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.883 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.884 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.884 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.884 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.884 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.884 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.884 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.885 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.885 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.885 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.885 228803 DEBUG 
oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.885 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.885 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.885 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.886 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.886 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.886 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.886 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.886 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.886 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.886 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.887 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.887 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.887 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] metadata_listen_port = 8775 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.887 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.887 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.887 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.888 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] my_block_storage_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.888 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] my_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.888 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.888 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.888 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.888 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.888 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.889 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.889 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.889 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.889 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 
04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.889 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.889 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.890 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.890 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.890 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.890 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.890 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.890 228803 
DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.890 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.891 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.891 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.891 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.891 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.891 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.891 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] reserved_host_cpus = 0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.892 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.892 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.892 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.892 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.892 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.892 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.892 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.893 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.893 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.893 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.893 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.893 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.893 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.893 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_down_time = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.894 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.894 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.894 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.894 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.894 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.894 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.895 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost 
nova_compute[228799]: 2025-11-26 09:30:44.895 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.895 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.895 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.895 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.895 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.896 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.896 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.896 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.896 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.896 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.896 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.897 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.897 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.897 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.897 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vcpu_pin_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.897 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.897 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.897 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.898 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.898 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.898 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.898 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.898 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.898 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.899 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.899 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.899 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.899 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.899 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.auth_strategy = keystone 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.899 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.899 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.900 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.900 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.900 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.900 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.900 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.900 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.900 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.901 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.901 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.901 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.901 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.901 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.901 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.902 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.902 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.902 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.902 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.902 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.902 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.902 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.903 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.903 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.903 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.903 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.903 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.903 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.enable_retry_client = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.903 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.904 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.904 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.904 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.904 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.904 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.904 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_password = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.905 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.905 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.905 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.905 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.905 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.905 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.906 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_socket_timeout = 1.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.906 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.906 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.906 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.906 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.906 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.907 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.907 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.907 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.907 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.907 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.907 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.907 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.908 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.908 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.908 228803 DEBUG 
oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.908 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.908 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.908 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.909 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.909 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.909 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.909 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 
- - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.909 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.909 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.909 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.910 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.910 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.910 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.910 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.cpu_dedicated_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.910 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.910 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.911 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.911 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.911 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.911 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.911 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.provider_config_location = 
/etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.911 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.912 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.912 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.912 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.912 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.912 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.912 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.913 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.913 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.913 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.913 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.913 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.913 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.913 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.914 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.914 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.914 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.914 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.914 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.914 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.914 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost 
nova_compute[228799]: 2025-11-26 09:30:44.915 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.915 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.915 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.915 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.915 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.915 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.916 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.916 228803 
DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.916 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.916 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.916 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.916 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.916 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.917 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.917 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.917 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.917 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.917 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.917 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.917 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.918 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.918 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.918 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.918 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.918 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.918 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.918 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.919 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.919 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.connection_debug = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.919 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.919 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.919 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.919 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.920 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.920 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.920 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.db_retry_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.920 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.920 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.920 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.921 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.921 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.921 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.921 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.pool_timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.921 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.921 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.921 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.922 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.922 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.922 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.922 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ephemeral_storage_encryption.key_size = 512 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.922 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.922 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.922 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.923 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.923 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.923 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.923 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.923 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.923 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.923 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.924 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.924 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.924 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.924 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 
2025-11-26 09:30:44.924 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.924 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.924 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.925 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.925 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.925 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.925 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.925 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - 
- - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.925 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.925 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.926 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.926 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.926 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.926 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.926 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.verify_glance_signatures = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.926 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.926 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.927 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.927 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.927 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.927 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.927 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.enable_remotefx = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.927 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.928 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.928 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.928 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.928 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.928 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.928 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.928 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.929 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.929 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.929 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.929 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.929 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.929 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] mks.enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.930 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.930 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.930 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.930 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.930 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.930 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.930 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.931 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.931 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.931 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.931 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.931 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.931 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.931 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.932 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.932 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.932 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.932 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.932 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.932 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.933 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 
localhost nova_compute[228799]: 2025-11-26 09:30:44.933 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.933 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.933 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.933 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.933 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.933 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.934 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.934 228803 DEBUG 
oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.934 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.934 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.934 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.934 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.934 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.935 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.935 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.935 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.935 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.935 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.935 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.935 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.936 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.936 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.936 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.936 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.936 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.936 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.936 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.937 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.937 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.937 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.937 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.937 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.937 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.937 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.938 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.938 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.collect_timing = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.938 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.938 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.938 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.938 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.938 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.939 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.939 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.939 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.939 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.939 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.939 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.940 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.940 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.940 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost 
nova_compute[228799]: 2025-11-26 09:30:44.940 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.940 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.940 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.940 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.941 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.941 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.941 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.941 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.941 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.941 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.941 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.942 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.942 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.942 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.942 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.942 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.942 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.942 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.943 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.943 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.943 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.943 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.status_code_retry_delay = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.943 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.943 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.943 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.943 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.944 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.944 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.944 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.944 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.944 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.944 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.945 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.945 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.945 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.945 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.945 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.945 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.945 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.946 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.946 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.946 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.946 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost 
nova_compute[228799]: 2025-11-26 09:30:44.946 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.946 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.947 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.947 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.947 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.947 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.947 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.947 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.947 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.948 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.948 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.948 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.948 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.948 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.948 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.948 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.949 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.949 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.949 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.949 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.949 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.949 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.950 228803 WARNING oslo_config.cfg [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 26 04:30:44 localhost nova_compute[228799]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 26 04:30:44 localhost nova_compute[228799]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 26 04:30:44 localhost nova_compute[228799]: and ``live_migration_inbound_addr`` respectively.
Nov 26 04:30:44 localhost nova_compute[228799]: ). Its value may be silently ignored in the future.
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.950 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.950 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.950 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.950 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.950 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.951 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.951 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.951 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.951 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.951 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.951 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.952 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.952 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.952 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.952 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.952 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.952 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.953 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.953 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rbd_secret_uuid = 0d5e5e6d-3c4b-5efe-8c65-346ae6715606 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.953 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.953 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.953 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.953 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.954 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.954 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.954 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.954 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.954 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.954 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.954 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.955 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.955 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.955 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.955 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.955 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.955 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.956 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.956 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.956 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.956 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.956 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.956 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.956 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.957 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.957 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.957 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.957 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.957 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.957 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.958 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.958 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.958 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.958 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.958 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.958 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.958 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.959 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.959 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.959 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.959 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.959 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.959 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.959 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.960 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.960 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.960 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.960 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.960 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.960 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.960 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.961 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.961 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.961 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.961 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.961 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.961 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.961 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.962 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.962 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.962 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.962 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.962 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.962 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.962 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.963 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.963 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.963 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.963 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.963 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.963 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.964 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.964 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.964 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.964 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.964 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.964 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.964 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.965 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.965 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.965 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.965 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.965 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.965 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.965 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.965 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.966 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.966 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.966 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.966 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.966 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.966 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.966 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.967 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.967 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.967 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.967 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.967 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.967 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.967 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.968 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.968 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.968 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.968 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.968 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.969 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.969 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.969 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.969 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.969 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.969 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.969 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.injected_files = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.970 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.970 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.970 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.970 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.970 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.970 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.970 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost 
nova_compute[228799]: 2025-11-26 09:30:44.971 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.971 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.971 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.971 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.971 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.971 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.972 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.972 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.972 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.972 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.972 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.973 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.973 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.973 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.973 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.973 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.973 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.973 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.974 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.974 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.974 228803 
DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.974 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.974 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.974 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.975 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.975 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.975 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.max_instances_per_host = 
50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.975 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.975 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.975 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.975 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.976 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.976 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.976 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.976 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.976 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.976 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.977 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.977 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.977 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 
2025-11-26 09:30:44.977 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.977 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.977 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.978 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.978 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.978 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.978 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 
2025-11-26 09:30:44.978 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.978 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.978 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.979 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.979 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.979 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.979 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.979 228803 DEBUG oslo_service.service 
[None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.979 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.980 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.980 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.980 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.980 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.980 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.981 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.981 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.981 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.981 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.981 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.982 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.982 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.982 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - 
- -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.982 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.982 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.982 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.983 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.983 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.983 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.983 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vendordata_dynamic_auth.auth_type = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.983 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.983 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.984 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.984 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.984 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.984 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.984 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.984 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.985 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.985 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.985 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.985 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.985 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.985 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.datastore_regex = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.986 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.986 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.986 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.986 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.986 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.986 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.987 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost 
nova_compute[228799]: 2025-11-26 09:30:44.987 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.987 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.989 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.989 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.989 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.990 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.990 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 
09:30:44.990 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.990 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.990 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.990 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.990 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.991 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.991 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.991 
228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.991 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.991 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.992 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.992 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.992 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.992 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.992 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.992 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.992 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.993 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.993 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.993 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.993 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 
2025-11-26 09:30:44.993 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.993 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.993 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.994 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.994 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.994 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.994 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.994 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.994 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.994 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.995 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.995 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.995 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.995 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.995 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.995 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.995 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.995 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.996 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.996 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.996 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.996 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.996 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.996 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.996 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.997 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.997 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.997 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.997 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.997 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.997 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.997 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.998 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.998 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.998 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.998 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.998 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.998 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.998 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.999 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.999 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 
localhost nova_compute[228799]: 2025-11-26 09:30:44.999 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.999 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:44 localhost nova_compute[228799]: 2025-11-26 09:30:44.999 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:44.999 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.000 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.000 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.000 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.000 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.000 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.000 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.000 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.001 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.001 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.001 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.001 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.001 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.001 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.001 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.002 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.002 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.002 
228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.002 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.002 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.002 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.002 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.003 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.003 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.003 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.003 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.003 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.003 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.003 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.004 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.004 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_notifications.driver = ['noop'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.004 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.004 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.004 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.004 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.005 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.005 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.005 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.005 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.005 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.005 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.005 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.006 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.006 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.006 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.006 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.006 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.006 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.006 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.006 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.007 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.007 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.007 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.007 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.007 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.007 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.007 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.008 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.008 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 
2025-11-26 09:30:45.008 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.008 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.008 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.008 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.008 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.008 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.009 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.009 228803 DEBUG 
oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.009 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.009 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.009 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.009 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.009 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.010 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.010 228803 DEBUG oslo_service.service [None 
req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.010 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.010 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.010 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.010 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.010 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.011 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost 
nova_compute[228799]: 2025-11-26 09:30:45.011 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.011 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.011 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.011 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.011 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.011 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.011 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.012 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.012 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.012 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.012 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.012 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.012 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.013 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.013 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.013 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.013 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.013 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.013 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.013 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.013 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.014 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.014 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.014 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.014 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.014 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.014 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.014 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] privsep_osbrick.logger_name = 
os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.015 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.015 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.015 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.015 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.015 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.015 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.015 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] 
nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.016 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.016 228803 DEBUG oslo_service.service [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.017 228803 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.030 228803 INFO nova.virt.node [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Determined node identity 05276789-7461-410b-9529-16f5185a8bff from /var/lib/nova/compute_id#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.030 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.031 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.031 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Starting connection event dispatch thread initialize 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.031 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.040 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.042 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.043 228803 INFO nova.virt.libvirt.driver [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Connection event '1' reason 'None'#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.056 228803 DEBUG nova.virt.libvirt.volume.mount [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.059 228803 INFO nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML garbled during log extraction (element tags stripped, leaving empty continuation lines); recoverable fields: host UUID 54d67e25-3d53-4e7f-ba95-c2d307a21761, arch x86_64, CPU model EPYC-Rome-v4 (vendor AMD), migration transports tcp and rdma, memory 16116612 KiB (4029153 pages), security models selinux (base label 0, labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (base label 0, labels +107:+107), guest type hvm for 32-bit and 64-bit via emulator /usr/libexec/qemu-kvm, machine types pc-i440fx-rhel7.6.0 (canonical pc) and the pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 series (canonical q35)]
#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.066 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.080 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[domain capabilities XML garbled during log extraction and truncated at the end of this chunk; recoverable fields: emulator /usr/libexec/qemu-kvm, domain type kvm, machine pc-q35-rhel9.8.0, arch i686, loader /usr/share/OVMF/OVMF_CODE.secboot.fd with types rom and pflash (readonly yes/no, secure no), enum values on/off, host CPU model EPYC-Rome (vendor AMD)]
Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 486 Nov 26 04:30:45 localhost nova_compute[228799]: 486-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-noTSX-IBRS Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v5 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Conroe Nov 26 04:30:45 localhost nova_compute[228799]: Conroe-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Genoa Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Genoa-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-IBPB Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v4 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v1 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v2 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-noTSX-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-noTSX
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v5
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v6
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v7
Nov 26 04:30:45 localhost nova_compute[228799]: IvyBridge
Nov 26 04:30:45 localhost nova_compute[228799]: IvyBridge-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: IvyBridge-v1
Nov 26 04:30:45 localhost nova_compute[228799]: IvyBridge-v2
Nov 26 04:30:45 localhost nova_compute[228799]: KnightsMill
Nov 26 04:30:45 localhost nova_compute[228799]: KnightsMill-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Nehalem
Nov 26 04:30:45 localhost nova_compute[228799]: Nehalem-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Nehalem-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Nehalem-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G1-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G2
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G2-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G3
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G3-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G4
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G4-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G5
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G5-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Penryn
Nov 26 04:30:45 localhost nova_compute[228799]: Penryn-v1
Nov 26 04:30:45 localhost nova_compute[228799]: SandyBridge
Nov 26 04:30:45 localhost nova_compute[228799]: SandyBridge-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: SandyBridge-v1
Nov 26 04:30:45 localhost nova_compute[228799]: SandyBridge-v2
Nov 26 04:30:45 localhost nova_compute[228799]: SapphireRapids
Nov 26 04:30:45 localhost nova_compute[228799]: SapphireRapids-v1
Nov 26 04:30:45 localhost nova_compute[228799]: SapphireRapids-v2
Nov 26 04:30:45 localhost nova_compute[228799]: SapphireRapids-v3
Nov 26 04:30:45 localhost nova_compute[228799]: SierraForest
Nov 26 04:30:45 localhost nova_compute[228799]: SierraForest-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-noTSX-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-noTSX-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v5
Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge
Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Westmere
Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-v2
Nov 26 04:30:45 localhost nova_compute[228799]: athlon
Nov 26 04:30:45 localhost nova_compute[228799]: athlon-v1
Nov 26 04:30:45 localhost nova_compute[228799]: core2duo
Nov 26 04:30:45 localhost nova_compute[228799]: core2duo-v1
Nov 26 04:30:45 localhost nova_compute[228799]: coreduo
Nov 26 04:30:45 localhost nova_compute[228799]: coreduo-v1
Nov 26 04:30:45 localhost nova_compute[228799]: kvm32
Nov 26 04:30:45 localhost nova_compute[228799]: kvm32-v1
Nov 26 04:30:45 localhost nova_compute[228799]: kvm64
Nov 26 04:30:45 localhost nova_compute[228799]: kvm64-v1
Nov 26 04:30:45 localhost nova_compute[228799]: n270
Nov 26 04:30:45 localhost nova_compute[228799]: n270-v1
Nov 26 04:30:45 localhost nova_compute[228799]: pentium
Nov 26 04:30:45 localhost nova_compute[228799]: pentium-v1
Nov 26 04:30:45 localhost nova_compute[228799]: pentium2
Nov 26 04:30:45 localhost nova_compute[228799]: pentium2-v1
Nov 26 04:30:45 localhost nova_compute[228799]: pentium3
Nov 26 04:30:45 localhost nova_compute[228799]: pentium3-v1
Nov 26 04:30:45 localhost nova_compute[228799]: phenom
Nov 26 04:30:45 localhost nova_compute[228799]: phenom-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: qemu32 Nov 26 04:30:45 localhost nova_compute[228799]: qemu32-v1 Nov 26 04:30:45 localhost nova_compute[228799]: qemu64 Nov 26 04:30:45 localhost nova_compute[228799]: qemu64-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: file Nov 26 04:30:45 localhost nova_compute[228799]: anonymous Nov 26 04:30:45 localhost nova_compute[228799]: memfd Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: disk Nov 26 04:30:45 localhost nova_compute[228799]: cdrom Nov 26 04:30:45 localhost nova_compute[228799]: floppy Nov 26 04:30:45 localhost nova_compute[228799]: lun Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: fdc Nov 26 04:30:45 localhost nova_compute[228799]: scsi Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost nova_compute[228799]: sata Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: virtio-transitional Nov 26 04:30:45 localhost nova_compute[228799]: virtio-non-transitional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: vnc Nov 26 04:30:45 
localhost nova_compute[228799]: egl-headless Nov 26 04:30:45 localhost nova_compute[228799]: dbus Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: subsystem Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: default Nov 26 04:30:45 localhost nova_compute[228799]: mandatory Nov 26 04:30:45 localhost nova_compute[228799]: requisite Nov 26 04:30:45 localhost nova_compute[228799]: optional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost nova_compute[228799]: pci Nov 26 04:30:45 localhost nova_compute[228799]: scsi Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: virtio-transitional Nov 26 04:30:45 localhost nova_compute[228799]: virtio-non-transitional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: random Nov 26 04:30:45 localhost nova_compute[228799]: egd Nov 26 04:30:45 localhost nova_compute[228799]: builtin Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: path Nov 26 04:30:45 localhost 
nova_compute[228799]: handle Nov 26 04:30:45 localhost nova_compute[228799]: virtiofs Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: tpm-tis Nov 26 04:30:45 localhost nova_compute[228799]: tpm-crb Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: emulator Nov 26 04:30:45 localhost nova_compute[228799]: external Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 2.0 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: pty Nov 26 04:30:45 localhost nova_compute[228799]: unix Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: qemu Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: builtin Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: default Nov 26 04:30:45 
localhost nova_compute[228799]: passt Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: isa Nov 26 04:30:45 localhost nova_compute[228799]: hyperv Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: null Nov 26 04:30:45 localhost nova_compute[228799]: vc Nov 26 04:30:45 localhost nova_compute[228799]: pty Nov 26 04:30:45 localhost nova_compute[228799]: dev Nov 26 04:30:45 localhost nova_compute[228799]: file Nov 26 04:30:45 localhost nova_compute[228799]: pipe Nov 26 04:30:45 localhost nova_compute[228799]: stdio Nov 26 04:30:45 localhost nova_compute[228799]: udp Nov 26 04:30:45 localhost nova_compute[228799]: tcp Nov 26 04:30:45 localhost nova_compute[228799]: unix Nov 26 04:30:45 localhost nova_compute[228799]: qemu-vdagent Nov 26 04:30:45 localhost nova_compute[228799]: dbus Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: relaxed Nov 26 04:30:45 localhost nova_compute[228799]: vapic 
Nov 26 04:30:45 localhost nova_compute[228799]: spinlocks Nov 26 04:30:45 localhost nova_compute[228799]: vpindex Nov 26 04:30:45 localhost nova_compute[228799]: runtime Nov 26 04:30:45 localhost nova_compute[228799]: synic Nov 26 04:30:45 localhost nova_compute[228799]: stimer Nov 26 04:30:45 localhost nova_compute[228799]: reset Nov 26 04:30:45 localhost nova_compute[228799]: vendor_id Nov 26 04:30:45 localhost nova_compute[228799]: frequencies Nov 26 04:30:45 localhost nova_compute[228799]: reenlightenment Nov 26 04:30:45 localhost nova_compute[228799]: tlbflush Nov 26 04:30:45 localhost nova_compute[228799]: ipi Nov 26 04:30:45 localhost nova_compute[228799]: avic Nov 26 04:30:45 localhost nova_compute[228799]: emsr_bitmap Nov 26 04:30:45 localhost nova_compute[228799]: xmm_input Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 4095 Nov 26 04:30:45 localhost nova_compute[228799]: on Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: Linux KVM Hv Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: tdx Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.087 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: /usr/libexec/qemu-kvm Nov 26 04:30:45 localhost nova_compute[228799]: kvm Nov 26 04:30:45 localhost nova_compute[228799]: pc-i440fx-rhel7.6.0 Nov 26 04:30:45 localhost nova_compute[228799]: i686 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: rom Nov 26 04:30:45 localhost nova_compute[228799]: pflash Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: yes Nov 26 04:30:45 localhost nova_compute[228799]: no Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: no Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: on Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: on Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome Nov 26 
04:30:45 localhost nova_compute[228799]: AMD Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 486 Nov 26 04:30:45 localhost nova_compute[228799]: 486-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-noTSX-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v5 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Conroe Nov 26 04:30:45 localhost nova_compute[228799]: Conroe-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Genoa Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
Nov 26 04:30:45 localhost nova_compute[228799]: [libvirt CPU model list] EPYC-Genoa-v1 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS
localhost nova_compute[228799]: Skylake-Client-noTSX-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-noTSX-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 
26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v5 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Westmere Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-v2 Nov 26 04:30:45 localhost nova_compute[228799]: athlon Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: athlon-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: core2duo Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: core2duo-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: coreduo Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: coreduo-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: kvm32 Nov 26 04:30:45 localhost nova_compute[228799]: kvm32-v1 Nov 26 04:30:45 localhost nova_compute[228799]: kvm64 Nov 26 04:30:45 localhost nova_compute[228799]: kvm64-v1 Nov 26 04:30:45 localhost nova_compute[228799]: n270 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: n270-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: pentium Nov 26 04:30:45 localhost nova_compute[228799]: pentium-v1 Nov 26 04:30:45 localhost nova_compute[228799]: pentium2 Nov 26 04:30:45 localhost nova_compute[228799]: pentium2-v1 Nov 26 04:30:45 localhost nova_compute[228799]: 
pentium3 Nov 26 04:30:45 localhost nova_compute[228799]: pentium3-v1 Nov 26 04:30:45 localhost nova_compute[228799]: phenom Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: phenom-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: qemu32 Nov 26 04:30:45 localhost nova_compute[228799]: qemu32-v1 Nov 26 04:30:45 localhost nova_compute[228799]: qemu64 Nov 26 04:30:45 localhost nova_compute[228799]: qemu64-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: file Nov 26 04:30:45 localhost nova_compute[228799]: anonymous Nov 26 04:30:45 localhost nova_compute[228799]: memfd Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: disk Nov 26 04:30:45 localhost nova_compute[228799]: cdrom Nov 26 04:30:45 localhost nova_compute[228799]: floppy Nov 26 04:30:45 localhost nova_compute[228799]: lun Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: ide Nov 26 04:30:45 localhost nova_compute[228799]: fdc Nov 26 04:30:45 localhost nova_compute[228799]: scsi Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost 
nova_compute[228799]: sata Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: virtio-transitional Nov 26 04:30:45 localhost nova_compute[228799]: virtio-non-transitional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: vnc Nov 26 04:30:45 localhost nova_compute[228799]: egl-headless Nov 26 04:30:45 localhost nova_compute[228799]: dbus Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: subsystem Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: default Nov 26 04:30:45 localhost nova_compute[228799]: mandatory Nov 26 04:30:45 localhost nova_compute[228799]: requisite Nov 26 04:30:45 localhost nova_compute[228799]: optional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost nova_compute[228799]: pci Nov 26 04:30:45 localhost nova_compute[228799]: scsi Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: virtio-transitional Nov 26 04:30:45 localhost 
nova_compute[228799]: virtio-non-transitional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: random Nov 26 04:30:45 localhost nova_compute[228799]: egd Nov 26 04:30:45 localhost nova_compute[228799]: builtin Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: path Nov 26 04:30:45 localhost nova_compute[228799]: handle Nov 26 04:30:45 localhost nova_compute[228799]: virtiofs Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: tpm-tis Nov 26 04:30:45 localhost nova_compute[228799]: tpm-crb Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: emulator Nov 26 04:30:45 localhost nova_compute[228799]: external Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 2.0 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: pty Nov 26 04:30:45 localhost nova_compute[228799]: unix Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: qemu Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: builtin Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: default Nov 26 04:30:45 localhost nova_compute[228799]: passt Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: isa Nov 26 04:30:45 localhost nova_compute[228799]: hyperv Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: null Nov 26 04:30:45 localhost nova_compute[228799]: vc Nov 26 04:30:45 localhost nova_compute[228799]: pty Nov 26 04:30:45 localhost nova_compute[228799]: dev Nov 26 04:30:45 localhost nova_compute[228799]: file Nov 26 04:30:45 localhost nova_compute[228799]: pipe Nov 26 04:30:45 localhost nova_compute[228799]: stdio Nov 26 04:30:45 localhost nova_compute[228799]: udp Nov 26 04:30:45 localhost nova_compute[228799]: tcp Nov 26 04:30:45 localhost nova_compute[228799]: unix Nov 26 04:30:45 localhost nova_compute[228799]: qemu-vdagent Nov 26 04:30:45 localhost nova_compute[228799]: dbus Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 
26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: relaxed Nov 26 04:30:45 localhost nova_compute[228799]: vapic Nov 26 04:30:45 localhost nova_compute[228799]: spinlocks Nov 26 04:30:45 localhost nova_compute[228799]: vpindex Nov 26 04:30:45 localhost nova_compute[228799]: runtime Nov 26 04:30:45 localhost nova_compute[228799]: synic Nov 26 04:30:45 localhost nova_compute[228799]: stimer Nov 26 04:30:45 localhost nova_compute[228799]: reset Nov 26 04:30:45 localhost nova_compute[228799]: vendor_id Nov 26 04:30:45 localhost nova_compute[228799]: frequencies Nov 26 04:30:45 localhost nova_compute[228799]: reenlightenment Nov 26 04:30:45 localhost nova_compute[228799]: tlbflush Nov 26 04:30:45 localhost nova_compute[228799]: ipi Nov 26 04:30:45 localhost nova_compute[228799]: avic Nov 26 04:30:45 localhost nova_compute[228799]: emsr_bitmap Nov 26 04:30:45 localhost nova_compute[228799]: xmm_input Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 4095 Nov 26 04:30:45 localhost nova_compute[228799]: on Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: Linux KVM Hv Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: tdx Nov 26 
Nov 26 04:30:45 localhost nova_compute[228799]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.111 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.115 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 26 04:30:45 localhost nova_compute[228799]: [domainCapabilities XML for q35; element markup stripped, recoverable text content follows]
Nov 26 04:30:45 localhost nova_compute[228799]: path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64
Nov 26 04:30:45 localhost nova_compute[228799]: os firmware: efi; loader values: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader type: rom, pflash; readonly: yes, no; secure: yes, no
Nov 26 04:30:45 localhost nova_compute[228799]: cpu migratable (host-passthrough, maximum): on, off; on, off
Nov 26 04:30:45 localhost nova_compute[228799]: host-model: EPYC-Rome, vendor AMD
Nov 26 04:30:45 localhost nova_compute[228799]: custom models (list continues): 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Conroe Nov 26 04:30:45 localhost nova_compute[228799]: Conroe-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v2 Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Genoa Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 
Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Genoa-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-IBPB Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: EPYC-Rome-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v4 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v1 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v2 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost python3.9[229139]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-noTSX-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 
Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
Nov 26 04:30:45 localhost nova_compute[228799]: [libvirt capabilities XML reply logged here; the XML markup was lost in capture and the syslog prefix was interleaved with every element. Recoverable element values are summarized below; grouping of values under headings is best-effort from surrounding context.]
Nov 26 04:30:45 localhost nova_compute[228799]: CPU models: Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 26 04:30:45 localhost nova_compute[228799]: memory backing source types: file, anonymous, memfd
Nov 26 04:30:45 localhost nova_compute[228799]: disk device types: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Nov 26 04:30:45 localhost nova_compute[228799]: graphics types: vnc, egl-headless, dbus
Nov 26 04:30:45 localhost nova_compute[228799]: hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Nov 26 04:30:45 localhost nova_compute[228799]: interface models: virtio, virtio-transitional, virtio-non-transitional; backends: default, passt
Nov 26 04:30:45 localhost nova_compute[228799]: rng backend models: random, egd, builtin
Nov 26 04:30:45 localhost nova_compute[228799]: filesystem driver types: path, handle, virtiofs
Nov 26 04:30:45 localhost nova_compute[228799]: tpm models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
Nov 26 04:30:45 localhost nova_compute[228799]: redirdev bus: usb; char types: pty, unix
Nov 26 04:30:45 localhost nova_compute[228799]: crypto model: qemu; backend model: builtin
Nov 26 04:30:45 localhost nova_compute[228799]: panic models: isa, hyperv
Nov 26 04:30:45 localhost nova_compute[228799]: console/serial char types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: relaxed Nov 26 04:30:45 localhost nova_compute[228799]: vapic Nov 26 04:30:45 localhost nova_compute[228799]: spinlocks Nov 26 04:30:45 localhost nova_compute[228799]: vpindex Nov 26 04:30:45 localhost nova_compute[228799]: runtime Nov 26 04:30:45 localhost nova_compute[228799]: synic Nov 26 04:30:45 localhost nova_compute[228799]: stimer Nov 26 04:30:45 localhost nova_compute[228799]: reset Nov 26 04:30:45 localhost nova_compute[228799]: vendor_id Nov 26 04:30:45 localhost nova_compute[228799]: frequencies Nov 26 04:30:45 localhost nova_compute[228799]: reenlightenment Nov 26 04:30:45 localhost nova_compute[228799]: tlbflush Nov 26 04:30:45 localhost nova_compute[228799]: ipi Nov 26 04:30:45 localhost nova_compute[228799]: avic Nov 26 04:30:45 localhost nova_compute[228799]: emsr_bitmap Nov 26 04:30:45 localhost nova_compute[228799]: xmm_input Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 4095 Nov 26 04:30:45 localhost nova_compute[228799]: on Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: Linux KVM Hv Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: tdx Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 
09:30:45.175 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: /usr/libexec/qemu-kvm Nov 26 04:30:45 localhost nova_compute[228799]: kvm Nov 26 04:30:45 localhost nova_compute[228799]: pc-i440fx-rhel7.6.0 Nov 26 04:30:45 localhost nova_compute[228799]: x86_64 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: rom Nov 26 04:30:45 localhost nova_compute[228799]: pflash Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: yes Nov 26 04:30:45 localhost nova_compute[228799]: no Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: no Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: on Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: on Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome Nov 26 04:30:45 localhost nova_compute[228799]: AMD Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 486 Nov 26 04:30:45 localhost nova_compute[228799]: 486-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-IBRS Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-noTSX-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Broadwell-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-noTSX Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cascadelake-Server-v5 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Conroe Nov 26 04:30:45 localhost nova_compute[228799]: Conroe-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Cooperlake-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Denverton-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Dhyana-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Genoa Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Genoa-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-IBPB Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Milan-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 
04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-Rome-v4 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v1 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v2 Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: EPYC-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids-v1 Nov 26 04:30:45 localhost 
Nov 26 04:30:45 localhost nova_compute[228799]: GraniteRapids-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-noTSX
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-noTSX-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Haswell-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-noTSX
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v5
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v6
Nov 26 04:30:45 localhost nova_compute[228799]: Icelake-Server-v7
Nov 26 04:30:45 localhost nova_compute[228799]: IvyBridge
Nov 26 04:30:45 localhost nova_compute[228799]: IvyBridge-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: IvyBridge-v1
Nov 26 04:30:45 localhost nova_compute[228799]: IvyBridge-v2
Nov 26 04:30:45 localhost nova_compute[228799]: KnightsMill
Nov 26 04:30:45 localhost nova_compute[228799]: KnightsMill-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Nehalem
Nov 26 04:30:45 localhost nova_compute[228799]: Nehalem-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Nehalem-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Nehalem-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G1-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G2
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G2-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G3
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G3-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G4
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G4-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G5
Nov 26 04:30:45 localhost nova_compute[228799]: Opteron_G5-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Penryn
Nov 26 04:30:45 localhost nova_compute[228799]: Penryn-v1
Nov 26 04:30:45 localhost nova_compute[228799]: SandyBridge
Nov 26 04:30:45 localhost nova_compute[228799]: SandyBridge-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: SandyBridge-v1
Nov 26 04:30:45 localhost nova_compute[228799]: SandyBridge-v2
Nov 26 04:30:45 localhost nova_compute[228799]: SapphireRapids
Nov 26 04:30:45 localhost nova_compute[228799]: SapphireRapids-v1
Nov 26 04:30:45 localhost nova_compute[228799]: SapphireRapids-v2
Nov 26 04:30:45 localhost nova_compute[228799]: SapphireRapids-v3
Nov 26 04:30:45 localhost nova_compute[228799]: SierraForest
Nov 26 04:30:45 localhost nova_compute[228799]: SierraForest-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-noTSX-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Client-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-noTSX-IBRS
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v1
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v2
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v3
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v4
Nov 26 04:30:45 localhost nova_compute[228799]: Skylake-Server-v5
Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge
Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v1
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v2 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v3 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Snowridge-v4 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Westmere Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-IBRS Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Westmere-v2 Nov 26 
04:30:45 localhost nova_compute[228799]: athlon Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: athlon-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: core2duo Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: core2duo-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: coreduo Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: coreduo-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: kvm32 Nov 26 04:30:45 localhost nova_compute[228799]: kvm32-v1 Nov 26 04:30:45 localhost nova_compute[228799]: kvm64 Nov 26 04:30:45 localhost nova_compute[228799]: kvm64-v1 Nov 26 04:30:45 localhost nova_compute[228799]: n270 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: n270-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: pentium Nov 26 04:30:45 localhost 
nova_compute[228799]: pentium-v1 Nov 26 04:30:45 localhost nova_compute[228799]: pentium2 Nov 26 04:30:45 localhost nova_compute[228799]: pentium2-v1 Nov 26 04:30:45 localhost nova_compute[228799]: pentium3 Nov 26 04:30:45 localhost nova_compute[228799]: pentium3-v1 Nov 26 04:30:45 localhost nova_compute[228799]: phenom Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: phenom-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: qemu32 Nov 26 04:30:45 localhost nova_compute[228799]: qemu32-v1 Nov 26 04:30:45 localhost nova_compute[228799]: qemu64 Nov 26 04:30:45 localhost nova_compute[228799]: qemu64-v1 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: file Nov 26 04:30:45 localhost nova_compute[228799]: anonymous Nov 26 04:30:45 localhost nova_compute[228799]: memfd Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: disk Nov 26 04:30:45 localhost nova_compute[228799]: cdrom Nov 26 04:30:45 localhost nova_compute[228799]: floppy Nov 26 04:30:45 localhost nova_compute[228799]: lun Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: ide Nov 26 04:30:45 localhost nova_compute[228799]: 
fdc Nov 26 04:30:45 localhost nova_compute[228799]: scsi Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost nova_compute[228799]: sata Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: virtio-transitional Nov 26 04:30:45 localhost nova_compute[228799]: virtio-non-transitional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: vnc Nov 26 04:30:45 localhost nova_compute[228799]: egl-headless Nov 26 04:30:45 localhost nova_compute[228799]: dbus Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: subsystem Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: default Nov 26 04:30:45 localhost nova_compute[228799]: mandatory Nov 26 04:30:45 localhost nova_compute[228799]: requisite Nov 26 04:30:45 localhost nova_compute[228799]: optional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost nova_compute[228799]: pci Nov 26 04:30:45 localhost nova_compute[228799]: scsi Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 
localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: virtio Nov 26 04:30:45 localhost nova_compute[228799]: virtio-transitional Nov 26 04:30:45 localhost nova_compute[228799]: virtio-non-transitional Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: random Nov 26 04:30:45 localhost nova_compute[228799]: egd Nov 26 04:30:45 localhost nova_compute[228799]: builtin Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: path Nov 26 04:30:45 localhost nova_compute[228799]: handle Nov 26 04:30:45 localhost nova_compute[228799]: virtiofs Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: tpm-tis Nov 26 04:30:45 localhost nova_compute[228799]: tpm-crb Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: emulator Nov 26 04:30:45 localhost nova_compute[228799]: external Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 2.0 Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: usb Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: pty Nov 26 
04:30:45 localhost nova_compute[228799]: unix Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: qemu Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: builtin Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: default Nov 26 04:30:45 localhost nova_compute[228799]: passt Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: isa Nov 26 04:30:45 localhost nova_compute[228799]: hyperv Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: null Nov 26 04:30:45 localhost nova_compute[228799]: vc Nov 26 04:30:45 localhost nova_compute[228799]: pty Nov 26 04:30:45 localhost nova_compute[228799]: dev Nov 26 04:30:45 localhost nova_compute[228799]: file Nov 26 04:30:45 localhost nova_compute[228799]: pipe Nov 26 04:30:45 localhost nova_compute[228799]: stdio Nov 26 04:30:45 localhost nova_compute[228799]: udp Nov 26 04:30:45 localhost nova_compute[228799]: tcp Nov 26 04:30:45 localhost nova_compute[228799]: unix Nov 26 04:30:45 localhost nova_compute[228799]: qemu-vdagent Nov 26 04:30:45 localhost nova_compute[228799]: dbus Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: relaxed Nov 26 04:30:45 localhost nova_compute[228799]: vapic Nov 26 04:30:45 localhost nova_compute[228799]: spinlocks Nov 26 04:30:45 localhost nova_compute[228799]: vpindex Nov 26 04:30:45 localhost nova_compute[228799]: runtime Nov 26 04:30:45 localhost nova_compute[228799]: synic Nov 26 04:30:45 localhost nova_compute[228799]: stimer Nov 26 04:30:45 localhost nova_compute[228799]: reset Nov 26 04:30:45 localhost nova_compute[228799]: vendor_id Nov 26 04:30:45 localhost nova_compute[228799]: frequencies Nov 26 04:30:45 localhost nova_compute[228799]: reenlightenment Nov 26 04:30:45 localhost nova_compute[228799]: tlbflush Nov 26 04:30:45 localhost nova_compute[228799]: ipi Nov 26 04:30:45 localhost nova_compute[228799]: avic Nov 26 04:30:45 localhost nova_compute[228799]: emsr_bitmap Nov 26 04:30:45 localhost nova_compute[228799]: xmm_input Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: 4095 Nov 26 04:30:45 localhost nova_compute[228799]: on Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: off Nov 26 04:30:45 localhost nova_compute[228799]: Linux KVM Hv Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost 
nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: tdx Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: Nov 26 04:30:45 localhost nova_compute[228799]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.242 228803 DEBUG nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.242 228803 INFO nova.virt.libvirt.host [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Secure Boot support detected#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.245 228803 INFO nova.virt.libvirt.driver [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.245 228803 INFO nova.virt.libvirt.driver [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.258 228803 DEBUG nova.virt.libvirt.driver [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 
09:30:45.300 228803 INFO nova.virt.node [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Determined node identity 05276789-7461-410b-9529-16f5185a8bff from /var/lib/nova/compute_id#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.328 228803 DEBUG nova.compute.manager [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Verified node 05276789-7461-410b-9529-16f5185a8bff matches my host np0005536118.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.382 228803 DEBUG nova.compute.manager [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.386 228803 DEBUG nova.virt.libvirt.vif [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-26T08:29:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005536118.localdomain',hostname='test',id=2,image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-26T08:29:20Z,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005536118.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b2fe3cd6f6ea49b8a2de01b236dd92e3',ramdisk_id='',reservation_id='r-hokjvvqr',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-11-26T08:29:20Z,user_data=None,user_id='9f8fafc3f43241c3a71039595891ea0e',uuid=9d78bef9-6977-4fb5-b50b-ae75124e73af,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": 
"b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.386 228803 DEBUG nova.network.os_vif_util [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Converting VIF {"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.387 228803 DEBUG nova.network.os_vif_util [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - 
-] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.387 228803 DEBUG os_vif [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.419 228803 DEBUG ovsdbapp.backend.ovs_idl [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.419 228803 DEBUG ovsdbapp.backend.ovs_idl [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.420 228803 DEBUG ovsdbapp.backend.ovs_idl [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.420 228803 DEBUG ovsdbapp.backend.ovs_idl.vlog [None 
req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.420 228803 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.421 228803 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.421 228803 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.422 228803 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.424 228803 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.436 228803 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.436 228803 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.437 228803 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.437 228803 INFO oslo.privsep.daemon [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp4mdjhhaw/privsep.sock']#033[00m Nov 26 04:30:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38671 DF PROTO=TCP SPT=57044 DPT=9105 SEQ=3451231984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD5D200000000001030307) Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.957 228803 INFO oslo.privsep.daemon [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.873 229224 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.878 229224 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.882 229224 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 26 04:30:45 localhost nova_compute[228799]: 2025-11-26 09:30:45.882 
229224 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229224#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.270 228803 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.271 228803 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5afdc9d0-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.272 228803 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5afdc9d0-95, col_values=(('external_ids', {'iface-id': '5afdc9d0-9595-4904-b83b-3d24f739ffec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:0f:d8', 'vm-uuid': '9d78bef9-6977-4fb5-b50b-ae75124e73af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.273 228803 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.273 228803 INFO os_vif [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95')#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.274 228803 DEBUG nova.compute.manager [None 
req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.277 228803 DEBUG nova.compute.manager [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.278 228803 INFO nova.compute.manager [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 26 04:30:46 localhost python3.9[229282]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None 
health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 26 04:30:46 localhost systemd-journald[47778]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 121.3 (404 of 333 items), suggesting rotation. Nov 26 04:30:46 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating. 
Nov 26 04:30:46 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:30:46 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:30:46 localhost sshd[229325]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.735 228803 INFO nova.service [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Updating service version for nova-compute on np0005536118.localdomain from 57 to 66#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.798 228803 DEBUG oslo_concurrency.lockutils [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.799 228803 DEBUG oslo_concurrency.lockutils [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.799 228803 DEBUG oslo_concurrency.lockutils [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.800 228803 DEBUG nova.compute.resource_tracker [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: 
np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:30:46 localhost nova_compute[228799]: 2025-11-26 09:30:46.801 228803 DEBUG oslo_concurrency.processutils [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:30:47 localhost nova_compute[228799]: 2025-11-26 09:30:47.288 228803 DEBUG oslo_concurrency.processutils [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:30:47 localhost python3.9[229439]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:30:47 localhost nova_compute[228799]: 2025-11-26 09:30:47.369 228803 DEBUG nova.virt.libvirt.driver [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:30:47 localhost nova_compute[228799]: 2025-11-26 09:30:47.369 228803 DEBUG nova.virt.libvirt.driver [None req-66d9efaa-f272-4755-aa29-e0e78cd498bd - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:30:47 localhost systemd[1]: Started libvirt nodedev daemon. Nov 26 04:30:47 localhost systemd[1]: Stopping nova_compute container... 
Nov 26 04:30:47 localhost nova_compute[228799]: 2025-11-26 09:30:47.469 228803 DEBUG oslo_concurrency.lockutils [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:30:47 localhost nova_compute[228799]: 2025-11-26 09:30:47.469 228803 DEBUG oslo_concurrency.lockutils [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:30:47 localhost nova_compute[228799]: 2025-11-26 09:30:47.470 228803 DEBUG oslo_concurrency.lockutils [None req-6de32ac9-14af-4cb7-ba4e-40716261cf50 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:30:47 localhost nova_compute[228799]: 2025-11-26 09:30:47.497 228803 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Nov 26 04:30:47 localhost journal[202976]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Nov 26 04:30:47 localhost journal[202976]: hostname: np0005536118.localdomain Nov 26 04:30:47 localhost journal[202976]: End of file while reading data: Input/output error Nov 26 04:30:47 localhost systemd[1]: libpod-d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98.scope: Deactivated successfully. Nov 26 04:30:47 localhost systemd[1]: libpod-d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98.scope: Consumed 4.039s CPU time. 
Nov 26 04:30:47 localhost podman[229446]: 2025-11-26 09:30:47.85966563 +0000 UTC m=+0.456627226 container died d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute) Nov 26 04:30:47 localhost systemd[1]: tmp-crun.yBeeVg.mount: Deactivated successfully. Nov 26 04:30:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:30:48 localhost systemd[1]: var-lib-containers-storage-overlay-be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b-merged.mount: Deactivated successfully. Nov 26 04:30:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38673 DF PROTO=TCP SPT=57044 DPT=9105 SEQ=3451231984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD693C0000000001030307) Nov 26 04:30:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:30:50 localhost systemd[1]: tmp-crun.jsRZuV.mount: Deactivated successfully. Nov 26 04:30:50 localhost podman[229753]: 2025-11-26 09:30:50.643112238 +0000 UTC m=+0.655749016 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 04:30:50 localhost podman[229753]: 2025-11-26 09:30:50.909242259 +0000 UTC m=+0.921878987 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118) Nov 26 04:30:52 localhost podman[229446]: 2025-11-26 09:30:52.383369918 +0000 UTC m=+4.980331434 container cleanup d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, 
tcib_managed=true, org.label-schema.build-date=20251118, config_id=edpm) Nov 26 04:30:52 localhost podman[229446]: nova_compute Nov 26 04:30:52 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:30:52 localhost podman[229786]: error opening file `/run/crun/d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98/status`: No such file or directory Nov 26 04:30:52 localhost podman[229773]: 2025-11-26 09:30:52.496372305 +0000 UTC m=+0.088156330 container cleanup d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:30:52 localhost podman[229773]: nova_compute Nov 26 04:30:52 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Nov 26 04:30:52 localhost systemd[1]: Stopped nova_compute container. Nov 26 04:30:52 localhost systemd[1]: Starting nova_compute container... Nov 26 04:30:52 localhost systemd[1]: Started libcrun container. Nov 26 04:30:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:52 localhost podman[229788]: 2025-11-26 09:30:52.646328777 +0000 UTC m=+0.124285459 container init d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm) Nov 26 04:30:52 localhost nova_compute[229802]: + sudo -E kolla_set_configs Nov 26 04:30:52 localhost podman[229788]: 2025-11-26 09:30:52.659310444 +0000 UTC m=+0.137267086 container start d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:30:52 localhost podman[229788]: nova_compute Nov 26 04:30:52 localhost systemd[1]: Started nova_compute container. 
Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Validating config file Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying service configuration files Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /etc/nova/nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/nova/nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 
26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /etc/ceph Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Creating directory /etc/ceph Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/ceph Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey 
Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Deleting /usr/sbin/iscsiadm Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Writing out command to execute Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:30:52 localhost nova_compute[229802]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 26 04:30:52 localhost nova_compute[229802]: ++ cat /run_command Nov 26 04:30:52 localhost nova_compute[229802]: + CMD=nova-compute Nov 26 04:30:52 localhost nova_compute[229802]: + ARGS= Nov 26 04:30:52 localhost nova_compute[229802]: + sudo kolla_copy_cacerts Nov 26 04:30:52 localhost nova_compute[229802]: + [[ ! -n '' ]] Nov 26 04:30:52 localhost nova_compute[229802]: + . 
kolla_extend_start Nov 26 04:30:52 localhost nova_compute[229802]: Running command: 'nova-compute' Nov 26 04:30:52 localhost nova_compute[229802]: + echo 'Running command: '\''nova-compute'\''' Nov 26 04:30:52 localhost nova_compute[229802]: + umask 0022 Nov 26 04:30:52 localhost nova_compute[229802]: + exec nova-compute Nov 26 04:30:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38674 DF PROTO=TCP SPT=57044 DPT=9105 SEQ=3451231984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD78FC0000000001030307) Nov 26 04:30:53 localhost python3.9[229923]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None 
healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 26 04:30:53 localhost systemd[1]: Started libpod-conmon-3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9.scope. Nov 26 04:30:53 localhost systemd[1]: Started libcrun container. 
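The kernel `DROPPING:` entries above come from an iptables/nftables LOG-target rule on `br-ex`: after the prefix, the payload is space-separated `KEY=VALUE` fields, with bare flags such as `DF` and `SYN` carrying no value and `OUT=` left empty on input-path drops. A minimal, illustrative parser for such lines (not part of any shipped tool) might look like:

```python
import re

# One KEY=VALUE field; \S* (not \S+) so that empty values like "OUT=" still
# match rather than swallowing the next field as their value.
FIELD_RE = re.compile(r"(\w+)=(\S*)")

def parse_drop_line(line):
    """Return a dict of KEY=VALUE fields from one LOG-target line.

    Bare flags (DF, SYN, ...) and the trailing OPT (...) blob contain no
    '=' and are simply skipped; empty values map to ''.
    """
    return dict(FIELD_RE.findall(line))
```

With the first `DROPPING:` line in this log, `parse_drop_line(...)` yields `SRC='192.168.122.10'`, `DPT='9105'`, `PROTO='TCP'`, and `OUT=''`, which is enough to see that SYN packets to the metrics port 9105 are being dropped at the bridge.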
Nov 26 04:30:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 04:30:53 localhost podman[229949]: 2025-11-26 09:30:53.781205119 +0000 UTC m=+0.120077074 container init 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 26 04:30:53 localhost podman[229949]: 2025-11-26 09:30:53.792363077 +0000 UTC m=+0.131235022 container start 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 04:30:53 localhost python3.9[229923]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Applying nova statedir ownership Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Nov 26 04:30:53 localhost nova_compute_init[229969]: 
INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/9d78bef9-6977-4fb5-b50b-ae75124e73af/ Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/9d78bef9-6977-4fb5-b50b-ae75124e73af already 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/9d78bef9-6977-4fb5-b50b-ae75124e73af to system_u:object_r:container_file_t:s0 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/9d78bef9-6977-4fb5-b50b-ae75124e73af/console.log Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: 
/var/lib/nova/instances/_base/ed49784906b83c1a7713dc04a5e33f72ee029af6 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ed49784906b83c1a7713dc04a5e33f72ee029af6 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Nov 26 04:30:53 
localhost nova_compute_init[229969]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/4143dbbec5b08621aa3c8eb364f8a7d3e97604e18b7ed41c4bab0da11ed561fd Nov 26 04:30:53 localhost nova_compute_init[229969]: INFO:nova_statedir:Nova statedir ownership complete Nov 26 04:30:53 localhost systemd[1]: libpod-3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9.scope: Deactivated successfully. 
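The `nova_compute_init` messages above ("Checking uid: ... path: ...", "Changing ownership ...", "Ownership ... already 42436:42436") trace a recursive walk over `/var/lib/nova` that chowns anything not already owned by the nova uid/gid, honoring a skip list (`NOVA_STATEDIR_OWNERSHIP_SKIP`). A minimal sketch of that traversal, with the chown call injectable so it can be exercised without root (this approximates the logged behavior and is not the actual `nova_statedir_ownership.py`):

```python
import logging
import os

LOG = logging.getLogger("nova_statedir")

def apply_statedir_ownership(statedir, uid, gid, skip_paths=(), chown=os.chown):
    """Walk statedir and chown every entry not already owned by uid:gid.

    Mirrors the log's flow: each path is checked and logged; only mismatches
    trigger a chown. 'chown' is a parameter so the walk can be dry-run.
    Returns the list of paths whose ownership was changed.
    """
    changed = []
    for dirpath, dirnames, filenames in os.walk(statedir):
        # '' stands for the directory itself, checked before its files.
        for name in [""] + filenames:
            path = os.path.join(dirpath, name) if name else dirpath
            if path in skip_paths:
                continue
            st = os.lstat(path)
            LOG.info("Checking uid: %d gid: %d path: %s", st.st_uid, st.st_gid, path)
            if (st.st_uid, st.st_gid) != (uid, gid):
                LOG.info("Changing ownership of %s from %d:%d to %d:%d",
                         path, st.st_uid, st.st_gid, uid, gid)
                chown(path, uid, gid)
                changed.append(path)
            else:
                LOG.info("Ownership of %s already %d:%d", path, uid, gid)
    return changed
```

The real script additionally relabels directories to `system_u:object_r:container_file_t:s0`, which requires SELinux tooling and is omitted here.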
Nov 26 04:30:53 localhost podman[229983]: 2025-11-26 09:30:53.981062934 +0000 UTC m=+0.109561807 container died 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 04:30:54 localhost podman[229983]: 2025-11-26 09:30:54.006435318 +0000 UTC m=+0.134934151 container cleanup 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator 
team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2) Nov 26 04:30:54 localhost systemd[1]: libpod-conmon-3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9.scope: Deactivated successfully. Nov 26 04:30:54 localhost systemd[1]: var-lib-containers-storage-overlay-1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f-merged.mount: Deactivated successfully. Nov 26 04:30:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9-userdata-shm.mount: Deactivated successfully. 
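The `config_data` dict that podman logs for `nova_compute_init` (image, user, net, security_opt, environment, volumes) is the declarative form that edpm_ansible hands to the podman module. As a rough illustration of how such a dict maps onto well-known podman CLI flags (this is not the module's actual translation logic, and only a few keys are covered):

```python
def podman_create_args(name, config):
    """Translate a subset of an edpm-style config_data dict into a
    'podman create' argument vector. Illustrative only: restart policy,
    detach, command wrapping, etc. are deliberately left out."""
    args = ["podman", "create", "--name", name]
    if config.get("user"):
        args += ["--user", config["user"]]
    if config.get("net"):
        args += ["--net", config["net"]]
    for opt in config.get("security_opt", []):
        args += ["--security-opt", opt]
    for key, val in config.get("environment", {}).items():
        args += ["--env", "%s=%s" % (key, val)]
    for vol in config.get("volumes", []):
        args += ["--volume", vol]
    args.append(config["image"])  # image is always the final positional arg
    return args
```

Feeding it the logged config would produce, among others, `--net none` (the init container needs no network), `--security-opt label=disable`, and the three `--volume` binds that explain the xfs remount messages seen earlier.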
Nov 26 04:30:54 localhost nova_compute[229802]: 2025-11-26 09:30:54.458 229806 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:30:54 localhost nova_compute[229802]: 2025-11-26 09:30:54.459 229806 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:30:54 localhost nova_compute[229802]: 2025-11-26 09:30:54.459 229806 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:30:54 localhost nova_compute[229802]: 2025-11-26 09:30:54.459 229806 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Nov 26 04:30:54 localhost nova_compute[229802]: 2025-11-26 09:30:54.581 229806 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:30:54 localhost nova_compute[229802]: 2025-11-26 09:30:54.599 229806 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:30:54 localhost nova_compute[229802]: 2025-11-26 09:30:54.600 229806 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Nov 26 04:30:54 localhost systemd[1]: session-54.scope: Deactivated successfully. Nov 26 04:30:54 localhost systemd[1]: session-54.scope: Consumed 2min 9.318s CPU time. Nov 26 04:30:54 localhost systemd-logind[761]: Session 54 logged out. Waiting for processes to exit. Nov 26 04:30:54 localhost systemd-logind[761]: Removed session 54. 
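The `grep -F node.session.scan /sbin/iscsiadm` command above (exit code 1, "failed. Not Retrying.") is a feature probe, not an error: the storage layer greps the iscsiadm binary for the `node.session.scan` option string to learn whether manual iSCSI scans are supported. A sketch of that probe under the same assumption (exit 0 means the string, and hence the feature, is present):

```python
import subprocess

def iscsiadm_supports_manual_scan(iscsiadm_path="/sbin/iscsiadm"):
    """Probe iscsiadm for manual-scan support by grepping the binary for
    the literal string 'node.session.scan', as seen in the log.

    grep exits 0 on a match, 1 on no match (the case logged here), and 2
    if the file is missing; only 0 counts as supported.
    """
    res = subprocess.run(
        ["grep", "-F", "node.session.scan", iscsiadm_path],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return res.returncode == 0
```

Probing a binary's help/option strings this way is a pragmatic alternative to version parsing when a capability was added without a version bump.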
Nov 26 04:30:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5040 DF PROTO=TCP SPT=36724 DPT=9102 SEQ=230464318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD807C0000000001030307) Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.001 229806 INFO nova.virt.driver [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.114 229806 INFO nova.compute.provider_config [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.121 229806 WARNING nova.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.121 229806 DEBUG oslo_concurrency.lockutils [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.121 229806 DEBUG oslo_concurrency.lockutils [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 
2025-11-26 09:30:55.122 229806 DEBUG oslo_concurrency.lockutils [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.122 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.122 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.122 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.122 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.123 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.123 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ================================================================================ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.123 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.123 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.123 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.123 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.123 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.124 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.124 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 
04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.124 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.124 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.124 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.124 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.124 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.125 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.125 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] console_host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost 
nova_compute[229802]: 2025-11-26 09:30:55.125 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.125 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.125 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.125 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.125 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.126 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.126 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.126 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.126 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.126 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.126 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.126 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.127 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] enabled_ssl_apis = 
[] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.127 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.127 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.127 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.127 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.127 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.128 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.128 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.128 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.128 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.128 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.128 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.128 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.129 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.129 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] instance_name_template = 
instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.129 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.129 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.129 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.129 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.129 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.130 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.130 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] live_migration_retry_count = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.130 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.130 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.130 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.130 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.131 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.131 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.131 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost 
nova_compute[229802]: 2025-11-26 09:30:55.131 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.131 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.131 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.132 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.132 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.132 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.132 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.132 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.132 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.132 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.133 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.133 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.133 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 
localhost nova_compute[229802]: 2025-11-26 09:30:55.133 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.133 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.133 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.134 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.134 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.134 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.134 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] my_block_storage_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.134 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] my_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.134 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.135 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.135 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.135 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.135 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.135 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.135 229806 DEBUG oslo_service.service 
[None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.136 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.136 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.136 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.136 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.136 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.136 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.136 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ram_allocation_ratio = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.137 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.137 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.137 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.137 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.137 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.138 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.138 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost 
nova_compute[229802]: 2025-11-26 09:30:55.138 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.138 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.138 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.139 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.139 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.139 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.139 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.139 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.139 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.140 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.140 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.140 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.140 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.140 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.140 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.140 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.140 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.141 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.141 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.141 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.141 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.141 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] source_is_ipv6 = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.141 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.141 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.142 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.142 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.142 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.142 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.142 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 
localhost nova_compute[229802]: 2025-11-26 09:30:55.142 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.142 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.143 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.143 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.143 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.143 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.143 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.143 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.143 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.143 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.144 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.144 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.144 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.144 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.144 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] watch_log_file = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.144 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.145 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.145 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.145 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.145 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.145 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.145 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.146 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.146 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.146 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.146 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.146 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.146 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.146 229806 
DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.147 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.147 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.147 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.147 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.147 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.147 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.148 229806 
DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.148 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.148 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.148 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.148 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.148 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.149 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.149 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.149 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.149 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.149 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.149 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.149 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.150 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.150 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.150 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.150 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.150 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.150 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.150 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.151 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.151 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.151 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.151 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.151 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.151 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.151 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.152 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.152 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.152 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.152 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.152 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.152 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.152 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.152 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.153 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.153 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.153 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.153 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.153 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.153 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.153 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.154 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.154 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.154 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.154 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.154 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.154 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.154 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.155 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.155 
229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.155 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.155 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.155 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.155 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.155 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.156 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.156 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.156 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.156 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.156 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.156 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.156 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.157 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.157 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.157 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.157 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.157 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.157 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.157 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.158 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.158 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.158 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.158 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.158 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.158 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.159 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.159 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.159 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.159 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.159 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.159 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.159 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.160 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.160 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.160 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.160 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.160 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.160 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.160 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.160 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.161 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.161 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.161 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.161 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.161 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.161 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.161 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.162 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.162 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.connection_trace = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.162 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.162 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.162 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.162 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.162 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.163 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.163 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.max_retries = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.163 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.163 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.163 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.163 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.164 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.164 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.164 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] database.sqlite_synchronous = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.164 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.164 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.164 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.164 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.164 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.165 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.165 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.165 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.165 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.165 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.165 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.165 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.166 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.166 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.166 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.166 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.166 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.166 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.166 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.167 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.167 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] devices.enabled_mdev_types = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.167 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.167 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.167 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.167 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.167 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.167 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.168 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.168 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.168 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.168 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.168 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.168 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.168 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.169 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.169 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.169 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.169 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.169 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.169 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.169 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.169 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost 
nova_compute[229802]: 2025-11-26 09:30:55.170 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.170 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.170 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.170 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.170 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.170 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.170 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.171 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.171 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.171 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.171 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.171 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.171 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.171 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.172 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.172 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.172 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.172 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.172 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.172 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.173 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.173 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.173 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.173 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.173 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.173 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.173 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.174 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.174 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - 
-] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.174 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.174 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.174 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.174 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.175 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.175 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.175 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.175 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.175 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.175 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.175 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.176 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.176 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.176 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.176 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.176 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.176 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.176 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.177 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.177 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.177 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.insecure = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.177 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.177 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.177 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.177 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.178 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.178 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.178 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.178 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.178 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.178 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.179 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.179 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.179 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.179 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 
2025-11-26 09:30:55.179 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.180 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.180 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.180 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.180 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.180 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.180 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.180 
229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.180 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.181 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.181 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.181 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.181 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.181 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.181 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.181 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.182 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.182 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.182 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.182 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.182 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.182 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.182 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.182 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.183 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.183 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.183 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.183 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.183 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.183 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.183 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.184 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.184 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.184 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.184 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.184 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.184 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.184 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.184 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.185 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.185 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.185 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.185 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost 
nova_compute[229802]: 2025-11-26 09:30:55.185 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.185 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.186 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.186 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.186 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.186 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.186 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.186 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.186 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.186 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.187 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.187 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.187 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.187 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.187 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.187 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.187 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.188 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.188 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.188 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.188 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.188 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.connection_uri = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.188 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.189 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.189 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.189 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.189 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.189 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.190 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.190 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.190 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.190 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.190 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.190 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.191 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.191 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.gid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.191 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.191 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.191 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.192 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.192 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.192 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.192 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.192 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.192 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.193 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.193 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.193 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.193 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.193 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.iser_use_multipath = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.193 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.193 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.194 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.194 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.194 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.194 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.194 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.194 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.194 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.195 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.195 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.195 229806 WARNING oslo_config.cfg [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Nov 26 04:30:55 localhost nova_compute[229802]: live_migration_uri is deprecated for removal in favor of two other options that Nov 26 04:30:55 localhost nova_compute[229802]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Nov 26 04:30:55 localhost nova_compute[229802]: and ``live_migration_inbound_addr`` respectively. Nov 26 04:30:55 localhost nova_compute[229802]: ). 
Its value may be silently ignored in the future.#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.195 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.195 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.196 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.196 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.196 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.196 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.196 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.197 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.197 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.197 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.197 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.197 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.198 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.198 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.198 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.198 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.198 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.198 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.199 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rbd_secret_uuid = 0d5e5e6d-3c4b-5efe-8c65-346ae6715606 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.199 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.199 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.199 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.199 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.200 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.200 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.200 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.200 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.200 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.200 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.201 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.201 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.201 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.201 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.201 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.201 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.201 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.202 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.202 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.202 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.202 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.202 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.202 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.203 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.203 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.203 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.203 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.203 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.203 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.203 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.204 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.204 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.204 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.204 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.204 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.204 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.205 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.205 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.205 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.205 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.205 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.205 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.205 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.205 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.206 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.206 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.206 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.206 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.206 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.206 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.206 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 
localhost nova_compute[229802]: 2025-11-26 09:30:55.207 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.207 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.207 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.207 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.207 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.207 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.207 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 
09:30:55.208 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.208 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.208 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.208 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.208 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.208 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.208 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.209 229806 
DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.209 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.209 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.209 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.209 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.209 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.210 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.210 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.210 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.210 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.210 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.210 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.211 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.211 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.211 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.211 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.211 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.211 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.212 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.212 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.212 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.212 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.password = 
**** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.212 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.212 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.212 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.213 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.213 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.213 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.213 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.service_type = placement log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.213 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.213 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.213 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.213 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.214 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.214 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.214 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.214 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.214 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.214 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.214 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.215 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.215 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.215 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.215 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.215 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.215 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.215 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.216 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.216 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.216 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 
09:30:55.216 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.216 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.216 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.216 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.216 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.217 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.217 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.217 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.217 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.217 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.217 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.218 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.218 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.218 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 
2025-11-26 09:30:55.218 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.218 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.218 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.218 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.219 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.219 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.219 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.219 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.219 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.219 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.219 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.220 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.220 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.220 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.220 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.220 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.220 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.220 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.220 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.221 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.221 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.221 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.221 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.221 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.221 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.221 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.222 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.222 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.222 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.222 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.222 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 
2025-11-26 09:30:55.222 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.222 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.223 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.223 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.223 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.223 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.223 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost 
nova_compute[229802]: 2025-11-26 09:30:55.223 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.223 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.224 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.224 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.224 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.224 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.224 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.224 
229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.224 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.224 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.225 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.225 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.225 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.225 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.225 229806 DEBUG oslo_service.service 
[None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.225 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.226 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.226 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.226 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.226 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.226 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.226 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.226 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.227 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.227 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.227 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.227 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.227 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.227 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.227 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.228 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.228 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.228 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.228 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.228 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.228 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.228 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.228 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.229 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.229 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.229 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.229 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.229 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.229 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.229 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.230 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.230 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.230 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.230 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.230 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.230 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.230 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.230 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.231 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.231 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.231 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.231 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 
2025-11-26 09:30:55.231 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.231 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.231 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.232 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.232 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.232 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.232 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 
09:30:55.232 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.232 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.233 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.233 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.233 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.233 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.233 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.233 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.233 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.233 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.234 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.234 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.234 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.234 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 
localhost nova_compute[229802]: 2025-11-26 09:30:55.234 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.234 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.234 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.235 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.235 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.235 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.235 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.235 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.235 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.235 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.236 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.236 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.236 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.236 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.236 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.236 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.236 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.237 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.237 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.237 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.237 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.237 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.237 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.237 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.237 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.238 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.238 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.238 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.238 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.238 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.239 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.239 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.239 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.239 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.239 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.239 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.239 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.239 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.240 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.240 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.240 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost 
nova_compute[229802]: 2025-11-26 09:30:55.240 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.240 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.240 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.240 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.241 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.241 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.241 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.241 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.241 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.241 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.242 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.242 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.242 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.242 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.242 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.242 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.242 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.242 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.243 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.243 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost 
nova_compute[229802]: 2025-11-26 09:30:55.243 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.243 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.243 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.243 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.243 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.244 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.244 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.244 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.244 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.244 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.244 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.244 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.244 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.245 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.245 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.245 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.245 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.245 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.245 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.245 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.246 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.246 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.246 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.246 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.246 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.246 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.246 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.246 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.247 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.247 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.247 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.247 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.247 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.247 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.247 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.248 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.248 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.248 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.248 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.248 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.248 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.248 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.248 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.249 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.249 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.249 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.249 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.249 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.249 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.249 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.250 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.250 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.250 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.250 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.250 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.250 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.250 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.251 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.251 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.251 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.251 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.251 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.251 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.251 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.252 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.252 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.252 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.252 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.252 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.252 229806 DEBUG oslo_service.service [None 
req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.252 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.253 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.253 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.253 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.253 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.253 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.253 
229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.253 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.254 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.254 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.254 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.254 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.254 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.254 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.254 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.255 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.255 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.255 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.255 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.255 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.255 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.255 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.255 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.256 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.256 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.256 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.256 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.256 229806 DEBUG 
oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.256 229806 DEBUG oslo_service.service [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.258 229806 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.346 229806 INFO nova.virt.node [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Determined node identity 05276789-7461-410b-9529-16f5185a8bff from /var/lib/nova/compute_id#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.348 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.349 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.350 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.350 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.362 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.365 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.368 229806 INFO nova.virt.libvirt.driver [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.386 229806 INFO nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Libvirt host capabilities
Nov 26 04:30:55 localhost nova_compute[229802]: [libvirt <capabilities> XML dump; markup lost in this capture, only element values survive. Recoverable values: host UUID 54d67e25-3d53-4e7f-ba95-c2d307a21761; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp, rdma; topology memory 16116612 with page counts 4029153, 0, 0; secmodels selinux (doi 0, base labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, +107:+107); hvm guest support for i686 (wordsize 32) and x86_64 (wordsize 64) via emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (alias pc), pc-q35-rhel9.8.0 (alias q35), pc-q35-rhel9.6.0, pc-q35-rhel9.4.0, pc-q35-rhel9.2.0, pc-q35-rhel9.0.0, pc-q35-rhel8.6.0, pc-q35-rhel8.5.0, pc-q35-rhel8.4.0, pc-q35-rhel8.3.0, pc-q35-rhel8.2.0, pc-q35-rhel8.1.0, pc-q35-rhel8.0.0, pc-q35-rhel7.6.0]#033[00m
Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.395 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.399 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 26 04:30:55 localhost nova_compute[229802]: [libvirt <domainCapabilities> XML dump; markup lost in this capture, only element values survive. Recoverable values: path /usr/libexec/qemu-kvm, domain kvm, machine pc-q35-rhel9.8.0, arch i686; firmware loader /usr/share/OVMF/OVMF_CODE.secboot.fd (loader types rom, pflash; yes/no and on/off feature flags present but no longer attributable); host-model CPU EPYC-Rome, vendor AMD; custom CPU models include 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 through Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 through Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, ... (dump truncated in capture)]
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cooperlake-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Dhyana Nov 26 04:30:55 localhost nova_compute[229802]: Dhyana-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Dhyana-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Genoa Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Genoa-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 
Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-IBPB Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v4 Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v1 Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v2 Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v3 Nov 
26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 
Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-noTSX Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-noTSX-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 
Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-noTSX Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v4 Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v5 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v6 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v7 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
Nov 26 04:30:55 localhost nova_compute[229802]: [libvirt domain capabilities dump; XML markup lost in log capture — recoverable values only]
Nov 26 04:30:55 localhost nova_compute[229802]:   CPU models: IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 26 04:30:55 localhost nova_compute[229802]:   memory backing source types: file, anonymous, memfd
Nov 26 04:30:55 localhost nova_compute[229802]:   disk devices: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Nov 26 04:30:55 localhost nova_compute[229802]:   graphics types: vnc, egl-headless, dbus
Nov 26 04:30:55 localhost nova_compute[229802]:   hostdev: mode subsystem; startupPolicy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Nov 26 04:30:55 localhost nova_compute[229802]:   rng models: virtio, virtio-transitional, virtio-non-transitional; backends: random, egd, builtin
Nov 26 04:30:55 localhost nova_compute[229802]:   filesystem driver types: path, handle, virtiofs
Nov 26 04:30:55 localhost nova_compute[229802]:   tpm models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
Nov 26 04:30:55 localhost nova_compute[229802]:   redirdev bus: usb; channel types: pty, unix; further backend values: qemu, builtin; interface backends: default, passt
Nov 26 04:30:55 localhost nova_compute[229802]:   panic models: isa, hyperv
Nov 26 04:30:55 localhost nova_compute[229802]:   character device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Nov 26 04:30:55 localhost nova_compute[229802]:   hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input
Nov 26 04:30:55 localhost nova_compute[229802]:   other values: 4095, on, off, off, Linux KVM Hv, tdx
Nov 26 04:30:55 localhost nova_compute[229802]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.406 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 26 04:30:55 localhost nova_compute[229802]: [second capabilities dump begins, markup likewise lost] emulator: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-i440fx-rhel7.6.0; arch: i686; loader: /usr/share/OVMF/OVMF_CODE.secboot.fd, types: rom, pflash; readonly: yes, no; secure: no; on, off; on, off; host CPU model: EPYC-Rome, vendor: AMD
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 486 Nov 26 04:30:55 localhost nova_compute[229802]: 486-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-noTSX Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-noTSX-IBRS Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-noTSX Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v5 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Conroe Nov 26 04:30:55 localhost nova_compute[229802]: Conroe-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Cooperlake Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cooperlake-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cooperlake-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Dhyana Nov 26 04:30:55 localhost nova_compute[229802]: Dhyana-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Dhyana-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Genoa Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Genoa-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-IBPB Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v4 Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v1 Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v2 Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-noTSX Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-noTSX-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: qemu32 Nov 26 04:30:55 localhost nova_compute[229802]: qemu32-v1 Nov 26 04:30:55 localhost nova_compute[229802]: qemu64 Nov 26 04:30:55 localhost nova_compute[229802]: qemu64-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: file Nov 26 04:30:55 localhost nova_compute[229802]: anonymous Nov 26 04:30:55 localhost nova_compute[229802]: memfd Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: disk Nov 26 04:30:55 localhost nova_compute[229802]: cdrom Nov 26 04:30:55 localhost nova_compute[229802]: floppy Nov 26 04:30:55 localhost nova_compute[229802]: lun Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: ide Nov 26 04:30:55 localhost nova_compute[229802]: fdc Nov 26 04:30:55 localhost nova_compute[229802]: scsi Nov 26 04:30:55 localhost nova_compute[229802]: virtio Nov 26 04:30:55 localhost nova_compute[229802]: usb Nov 26 04:30:55 localhost nova_compute[229802]: sata Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: virtio Nov 26 04:30:55 localhost nova_compute[229802]: virtio-transitional Nov 26 04:30:55 localhost nova_compute[229802]: virtio-non-transitional Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: vnc Nov 26 04:30:55 localhost nova_compute[229802]: egl-headless Nov 26 04:30:55 localhost nova_compute[229802]: dbus Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: subsystem Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: default Nov 26 04:30:55 localhost nova_compute[229802]: mandatory Nov 26 04:30:55 localhost nova_compute[229802]: requisite Nov 26 04:30:55 localhost nova_compute[229802]: optional Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: usb Nov 26 04:30:55 localhost nova_compute[229802]: pci Nov 26 04:30:55 localhost nova_compute[229802]: scsi Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: virtio Nov 26 04:30:55 localhost nova_compute[229802]: virtio-transitional Nov 26 04:30:55 localhost nova_compute[229802]: virtio-non-transitional Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: random Nov 26 04:30:55 localhost nova_compute[229802]: egd Nov 26 04:30:55 localhost nova_compute[229802]: builtin Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: path Nov 26 04:30:55 localhost nova_compute[229802]: handle Nov 26 04:30:55 localhost nova_compute[229802]: virtiofs Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: tpm-tis Nov 26 04:30:55 localhost nova_compute[229802]: tpm-crb Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: emulator Nov 26 04:30:55 localhost nova_compute[229802]: external Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 2.0 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: usb Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: pty Nov 26 04:30:55 localhost nova_compute[229802]: unix Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: qemu Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: builtin Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: default Nov 26 04:30:55 localhost nova_compute[229802]: passt Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: isa Nov 26 04:30:55 localhost nova_compute[229802]: hyperv Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: null Nov 26 04:30:55 localhost nova_compute[229802]: vc Nov 26 04:30:55 localhost nova_compute[229802]: pty Nov 26 04:30:55 localhost nova_compute[229802]: dev Nov 26 04:30:55 localhost nova_compute[229802]: file Nov 26 04:30:55 localhost nova_compute[229802]: pipe Nov 26 04:30:55 localhost nova_compute[229802]: stdio Nov 26 04:30:55 localhost nova_compute[229802]: udp Nov 26 04:30:55 localhost nova_compute[229802]: tcp Nov 26 04:30:55 localhost nova_compute[229802]: unix Nov 26 04:30:55 localhost nova_compute[229802]: qemu-vdagent Nov 26 04:30:55 localhost nova_compute[229802]: dbus Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: relaxed 
Nov 26 04:30:55 localhost nova_compute[229802]: vapic Nov 26 04:30:55 localhost nova_compute[229802]: spinlocks Nov 26 04:30:55 localhost nova_compute[229802]: vpindex Nov 26 04:30:55 localhost nova_compute[229802]: runtime Nov 26 04:30:55 localhost nova_compute[229802]: synic Nov 26 04:30:55 localhost nova_compute[229802]: stimer Nov 26 04:30:55 localhost nova_compute[229802]: reset Nov 26 04:30:55 localhost nova_compute[229802]: vendor_id Nov 26 04:30:55 localhost nova_compute[229802]: frequencies Nov 26 04:30:55 localhost nova_compute[229802]: reenlightenment Nov 26 04:30:55 localhost nova_compute[229802]: tlbflush Nov 26 04:30:55 localhost nova_compute[229802]: ipi Nov 26 04:30:55 localhost nova_compute[229802]: avic Nov 26 04:30:55 localhost nova_compute[229802]: emsr_bitmap Nov 26 04:30:55 localhost nova_compute[229802]: xmm_input Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 4095 Nov 26 04:30:55 localhost nova_compute[229802]: on Nov 26 04:30:55 localhost nova_compute[229802]: off Nov 26 04:30:55 localhost nova_compute[229802]: off Nov 26 04:30:55 localhost nova_compute[229802]: Linux KVM Hv Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: tdx Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.430 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Getting domain capabilities for 
x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.437 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: /usr/libexec/qemu-kvm Nov 26 04:30:55 localhost nova_compute[229802]: kvm Nov 26 04:30:55 localhost nova_compute[229802]: pc-i440fx-rhel7.6.0 Nov 26 04:30:55 localhost nova_compute[229802]: x86_64 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: rom Nov 26 04:30:55 localhost nova_compute[229802]: pflash Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: yes Nov 26 04:30:55 localhost nova_compute[229802]: no Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: no Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: on Nov 26 04:30:55 localhost nova_compute[229802]: off Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: on Nov 26 04:30:55 localhost nova_compute[229802]: off Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome Nov 26 04:30:55 localhost nova_compute[229802]: AMD Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 486 Nov 26 04:30:55 localhost nova_compute[229802]: 486-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 
26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-noTSX Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-noTSX-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Broadwell-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-noTSX Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 
Cascadelake-Server-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cascadelake-Server-v5 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Conroe Nov 26 04:30:55 localhost nova_compute[229802]: Conroe-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Cooperlake Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cooperlake-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Cooperlake-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Denverton-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Dhyana Nov 26 04:30:55 localhost nova_compute[229802]: Dhyana-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Dhyana-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Genoa Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Genoa-v1
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-IBPB
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan-v1
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Milan-v2
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v1
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v2
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v3
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome-v4
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v1
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v2
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v3
Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-v4
Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids
Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids-v1
Nov 26 04:30:55 localhost nova_compute[229802]: GraniteRapids-v2
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-IBRS
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-noTSX
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-noTSX-IBRS
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v2
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v3
Nov 26 04:30:55 localhost nova_compute[229802]: Haswell-v4
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-noTSX
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v2
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v3
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v4
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v5
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v6
Nov 26 04:30:55 localhost nova_compute[229802]: Icelake-Server-v7
Nov 26 04:30:55 localhost nova_compute[229802]: IvyBridge
Nov 26 04:30:55 localhost nova_compute[229802]: IvyBridge-IBRS
Nov 26 04:30:55 localhost nova_compute[229802]: IvyBridge-v1
Nov 26 04:30:55 localhost nova_compute[229802]: IvyBridge-v2
Nov 26 04:30:55 localhost nova_compute[229802]: KnightsMill
Nov 26 04:30:55 localhost nova_compute[229802]: KnightsMill-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Nehalem
Nov 26 04:30:55 localhost nova_compute[229802]: Nehalem-IBRS
Nov 26 04:30:55 localhost nova_compute[229802]: Nehalem-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Nehalem-v2
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G1
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G1-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G2
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G2-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G3
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G3-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G4
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G4-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G5
Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G5-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Penryn
Nov 26 04:30:55 localhost nova_compute[229802]: Penryn-v1
Nov 26 04:30:55 localhost nova_compute[229802]: SandyBridge
Nov 26 04:30:55 localhost nova_compute[229802]: SandyBridge-IBRS
Nov 26 04:30:55 localhost nova_compute[229802]: SandyBridge-v1
Nov 26 04:30:55 localhost nova_compute[229802]: SandyBridge-v2
Nov 26 04:30:55 localhost nova_compute[229802]: SapphireRapids
Nov 26 04:30:55 localhost nova_compute[229802]: SapphireRapids-v1
Nov 26 04:30:55 localhost nova_compute[229802]: SapphireRapids-v2
Nov 26 04:30:55 localhost nova_compute[229802]: SapphireRapids-v3
Nov 26 04:30:55 localhost nova_compute[229802]: SierraForest
Nov 26 04:30:55 localhost nova_compute[229802]: SierraForest-v1
Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client
nova_compute[229802]: Skylake-Client-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-noTSX-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-noTSX-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 
26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v5 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge-v1 Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Westmere 
Nov 26 04:30:55 localhost nova_compute[229802]: Westmere-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Westmere-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Westmere-v2 Nov 26 04:30:55 localhost nova_compute[229802]: athlon Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: athlon-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: core2duo Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: core2duo-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: coreduo Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: coreduo-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: kvm32 Nov 26 04:30:55 localhost nova_compute[229802]: kvm32-v1 Nov 26 04:30:55 localhost nova_compute[229802]: kvm64 Nov 26 04:30:55 localhost nova_compute[229802]: kvm64-v1 Nov 26 04:30:55 localhost nova_compute[229802]: n270 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: n270-v1 Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: pentium Nov 26 04:30:55 localhost nova_compute[229802]: pentium-v1 Nov 26 04:30:55 localhost nova_compute[229802]: pentium2 Nov 26 04:30:55 localhost nova_compute[229802]: pentium2-v1 Nov 26 04:30:55 localhost nova_compute[229802]: pentium3 Nov 26 04:30:55 localhost nova_compute[229802]: pentium3-v1 Nov 26 04:30:55 localhost nova_compute[229802]: phenom Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: phenom-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: qemu32 Nov 26 04:30:55 localhost nova_compute[229802]: qemu32-v1 Nov 26 04:30:55 localhost nova_compute[229802]: qemu64 Nov 26 04:30:55 localhost nova_compute[229802]: qemu64-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: file Nov 26 04:30:55 localhost nova_compute[229802]: anonymous Nov 26 04:30:55 localhost nova_compute[229802]: memfd Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: disk Nov 26 04:30:55 localhost nova_compute[229802]: cdrom Nov 26 04:30:55 localhost nova_compute[229802]: floppy Nov 26 04:30:55 localhost nova_compute[229802]: 
lun Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: ide Nov 26 04:30:55 localhost nova_compute[229802]: fdc Nov 26 04:30:55 localhost nova_compute[229802]: scsi Nov 26 04:30:55 localhost nova_compute[229802]: virtio Nov 26 04:30:55 localhost nova_compute[229802]: usb Nov 26 04:30:55 localhost nova_compute[229802]: sata Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: virtio Nov 26 04:30:55 localhost nova_compute[229802]: virtio-transitional Nov 26 04:30:55 localhost nova_compute[229802]: virtio-non-transitional Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: vnc Nov 26 04:30:55 localhost nova_compute[229802]: egl-headless Nov 26 04:30:55 localhost nova_compute[229802]: dbus Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: subsystem Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: default Nov 26 04:30:55 localhost nova_compute[229802]: mandatory Nov 26 04:30:55 localhost nova_compute[229802]: requisite Nov 26 04:30:55 localhost nova_compute[229802]: optional Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: usb Nov 26 04:30:55 localhost nova_compute[229802]: pci Nov 26 04:30:55 localhost nova_compute[229802]: scsi Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: virtio Nov 26 04:30:55 localhost nova_compute[229802]: virtio-transitional Nov 26 04:30:55 localhost nova_compute[229802]: virtio-non-transitional Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: random Nov 26 04:30:55 localhost nova_compute[229802]: egd Nov 26 04:30:55 localhost nova_compute[229802]: builtin Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: path Nov 26 04:30:55 localhost nova_compute[229802]: handle Nov 26 04:30:55 localhost nova_compute[229802]: virtiofs Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: tpm-tis Nov 26 04:30:55 localhost nova_compute[229802]: tpm-crb Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: emulator Nov 26 04:30:55 localhost nova_compute[229802]: external Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 2.0 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: usb Nov 26 04:30:55 localhost nova_compute[229802]: 
Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: pty Nov 26 04:30:55 localhost nova_compute[229802]: unix Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: qemu Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: builtin Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: default Nov 26 04:30:55 localhost nova_compute[229802]: passt Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: isa Nov 26 04:30:55 localhost nova_compute[229802]: hyperv Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: null Nov 26 04:30:55 localhost nova_compute[229802]: vc Nov 26 04:30:55 localhost nova_compute[229802]: pty Nov 26 04:30:55 localhost nova_compute[229802]: dev Nov 26 04:30:55 localhost nova_compute[229802]: file Nov 26 04:30:55 localhost nova_compute[229802]: pipe Nov 26 04:30:55 localhost nova_compute[229802]: stdio Nov 26 04:30:55 localhost nova_compute[229802]: udp Nov 26 04:30:55 localhost nova_compute[229802]: tcp Nov 26 04:30:55 localhost 
nova_compute[229802]: unix Nov 26 04:30:55 localhost nova_compute[229802]: qemu-vdagent Nov 26 04:30:55 localhost nova_compute[229802]: dbus Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: relaxed Nov 26 04:30:55 localhost nova_compute[229802]: vapic Nov 26 04:30:55 localhost nova_compute[229802]: spinlocks Nov 26 04:30:55 localhost nova_compute[229802]: vpindex Nov 26 04:30:55 localhost nova_compute[229802]: runtime Nov 26 04:30:55 localhost nova_compute[229802]: synic Nov 26 04:30:55 localhost nova_compute[229802]: stimer Nov 26 04:30:55 localhost nova_compute[229802]: reset Nov 26 04:30:55 localhost nova_compute[229802]: vendor_id Nov 26 04:30:55 localhost nova_compute[229802]: frequencies Nov 26 04:30:55 localhost nova_compute[229802]: reenlightenment Nov 26 04:30:55 localhost nova_compute[229802]: tlbflush Nov 26 04:30:55 localhost nova_compute[229802]: ipi Nov 26 04:30:55 localhost nova_compute[229802]: avic Nov 26 04:30:55 localhost nova_compute[229802]: emsr_bitmap Nov 26 04:30:55 localhost nova_compute[229802]: xmm_input Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: 4095 Nov 26 04:30:55 localhost nova_compute[229802]: on Nov 26 04:30:55 localhost 
nova_compute[229802]: off Nov 26 04:30:55 localhost nova_compute[229802]: off Nov 26 04:30:55 localhost nova_compute[229802]: Linux KVM Hv Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: tdx Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.487 229806 DEBUG nova.virt.libvirt.volume.mount [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.496 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: /usr/libexec/qemu-kvm Nov 26 04:30:55 localhost nova_compute[229802]: kvm Nov 26 04:30:55 localhost nova_compute[229802]: pc-q35-rhel9.8.0 Nov 26 04:30:55 localhost nova_compute[229802]: x86_64 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: efi Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd Nov 26 
04:30:55 localhost nova_compute[229802]: /usr/share/edk2/ovmf/OVMF_CODE.fd Nov 26 04:30:55 localhost nova_compute[229802]: /usr/share/edk2/ovmf/OVMF.amdsev.fd Nov 26 04:30:55 localhost nova_compute[229802]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: rom Nov 26 04:30:55 localhost nova_compute[229802]: pflash Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: yes Nov 26 04:30:55 localhost nova_compute[229802]: no Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: yes Nov 26 04:30:55 localhost nova_compute[229802]: no Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: on Nov 26 04:30:55 localhost nova_compute[229802]: off Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: on Nov 26 04:30:55 localhost nova_compute[229802]: off Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: EPYC-Rome Nov 26 04:30:55 localhost nova_compute[229802]: AMD Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
Nov 26 04:30:55 localhost nova_compute[229802]: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS
Nehalem-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nehalem-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G1 Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G1-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G2 Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G2-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G3 Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G3-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G4-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G5 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Opteron_G5-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Penryn Nov 26 04:30:55 localhost nova_compute[229802]: Penryn-v1 Nov 26 04:30:55 localhost nova_compute[229802]: SandyBridge Nov 26 04:30:55 localhost nova_compute[229802]: SandyBridge-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: SandyBridge-v1 Nov 26 04:30:55 localhost nova_compute[229802]: SandyBridge-v2 Nov 26 04:30:55 localhost nova_compute[229802]: SapphireRapids Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: SapphireRapids-v1 Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: SapphireRapids-v2 Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 
04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: SapphireRapids-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: SierraForest Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: SierraForest-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-noTSX-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Client-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-noTSX-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 
26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Skylake-Server-v5 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge-v2 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge-v3 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Snowridge-v4 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Westmere Nov 26 04:30:55 localhost nova_compute[229802]: Westmere-IBRS Nov 26 04:30:55 localhost nova_compute[229802]: Westmere-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Westmere-v2 Nov 26 04:30:55 localhost nova_compute[229802]: athlon Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: athlon-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 
localhost nova_compute[229802]: core2duo Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: core2duo-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: coreduo Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: coreduo-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: kvm32 Nov 26 04:30:55 localhost nova_compute[229802]: kvm32-v1 Nov 26 04:30:55 localhost nova_compute[229802]: kvm64 Nov 26 04:30:55 localhost nova_compute[229802]: kvm64-v1 Nov 26 04:30:55 localhost nova_compute[229802]: n270 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: n270-v1 Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: pentium Nov 26 04:30:55 localhost nova_compute[229802]: pentium-v1 Nov 26 04:30:55 localhost nova_compute[229802]: pentium2 Nov 26 04:30:55 localhost nova_compute[229802]: pentium2-v1 Nov 26 04:30:55 localhost nova_compute[229802]: pentium3 Nov 26 04:30:55 localhost nova_compute[229802]: pentium3-v1 Nov 26 04:30:55 localhost nova_compute[229802]: phenom Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost nova_compute[229802]: Nov 26 04:30:55 localhost 
nova_compute[229802]: [libvirt domainCapabilities XML elided: the markup was stripped during extraction, leaving only element text interleaved with syslog prefixes. Surviving content listed CPU models (phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1); memory backing types (file, anonymous, memfd); disk devices (disk, cdrom, floppy, lun) and buses (fdc, scsi, virtio, usb, sata); virtio model variants (virtio, virtio-transitional, virtio-non-transitional); graphics backends (vnc, egl-headless, dbus); hostdev subsystem types (usb, pci, scsi) with startup policies (default, mandatory, requisite, optional); RNG backends (random, egd, builtin); filesystem drivers (path, handle, virtiofs); TPM 2.0 models (tpm-tis, tpm-crb) with emulator/external backends; redirdev bus (usb); char device types (null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus); net backends (default, passt); panic models (isa, hyperv); Hyper-V enlightenments (relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id "Linux KVM Hv", frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; maxvcpus 4095); and launch security tdx.] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 26 04:30:55
localhost nova_compute[229802]: 2025-11-26 09:30:55.549 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.550 229806 INFO nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Secure Boot support detected#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.552 229806 INFO nova.virt.libvirt.driver [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.567 229806 DEBUG nova.virt.libvirt.driver [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Enabling emulated TPM support _check_vtpm_support
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.592 229806 INFO nova.virt.node [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Determined node identity 05276789-7461-410b-9529-16f5185a8bff from /var/lib/nova/compute_id#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.612 229806 DEBUG nova.compute.manager [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Verified node 05276789-7461-410b-9529-16f5185a8bff matches my host np0005536118.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.647 229806 DEBUG nova.compute.manager [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.652 229806 DEBUG nova.virt.libvirt.vif [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-26T08:29:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005536118.localdomain',hostname='test',id=2,image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-26T08:29:20Z,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005536118.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b2fe3cd6f6ea49b8a2de01b236dd92e3',ramdisk_id='',reservation_id='r-hokjvvqr',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-26T08:29:20Z,user_data=None,user_id='9f8fafc3f43241c3a71039595891ea0e',uuid=9d78bef9-6977-4fb5-b50b-ae75124e73af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": 
"b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.652 229806 DEBUG nova.network.os_vif_util [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Converting VIF {"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.653 229806 DEBUG nova.network.os_vif_util [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - 
-] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.654 229806 DEBUG os_vif [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.698 229806 DEBUG ovsdbapp.backend.ovs_idl [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.698 229806 DEBUG ovsdbapp.backend.ovs_idl [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.699 229806 DEBUG ovsdbapp.backend.ovs_idl [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.699 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [None 
req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.699 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.700 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.700 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.702 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.705 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.722 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.722 229806 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.722 229806 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:30:55 localhost nova_compute[229802]: 2025-11-26 09:30:55.723 229806 INFO oslo.privsep.daemon [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpq1ejaa39/privsep.sock']#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.341 229806 INFO oslo.privsep.daemon [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.238 230054 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.243 230054 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.246 230054 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.246 230054 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230054#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.616 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:56 localhost 
nova_compute[229802]: 2025-11-26 09:30:56.617 229806 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5afdc9d0-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.617 229806 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5afdc9d0-95, col_values=(('external_ids', {'iface-id': '5afdc9d0-9595-4904-b83b-3d24f739ffec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:0f:d8', 'vm-uuid': '9d78bef9-6977-4fb5-b50b-ae75124e73af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.618 229806 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.618 229806 INFO os_vif [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95')#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.618 229806 DEBUG nova.compute.manager [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.622 229806 DEBUG nova.compute.manager [None 
req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.622 229806 INFO nova.compute.manager [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.686 229806 DEBUG oslo_concurrency.lockutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.686 229806 DEBUG oslo_concurrency.lockutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.687 229806 DEBUG oslo_concurrency.lockutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.687 229806 DEBUG nova.compute.resource_tracker [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m 
Nov 26 04:30:56 localhost nova_compute[229802]: 2025-11-26 09:30:56.687 229806 DEBUG oslo_concurrency.processutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.164 229806 DEBUG oslo_concurrency.processutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.337 229806 DEBUG nova.virt.libvirt.driver [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.338 229806 DEBUG nova.virt.libvirt.driver [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.503 229806 WARNING nova.virt.libvirt.driver [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.504 229806 DEBUG nova.compute.resource_tracker [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12913MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.504 229806 DEBUG oslo_concurrency.lockutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.504 229806 DEBUG oslo_concurrency.lockutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:30:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7599 DF PROTO=TCP SPT=51752 DPT=9102 SEQ=2614096758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD8BFC0000000001030307) Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.838 229806 DEBUG nova.compute.resource_tracker [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.839 229806 DEBUG nova.compute.resource_tracker [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.839 229806 DEBUG nova.compute.resource_tracker [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.890 229806 DEBUG nova.scheduler.client.report [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.945 229806 DEBUG nova.scheduler.client.report [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 
04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.946 229806 DEBUG nova.compute.provider_tree [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.963 229806 DEBUG nova.scheduler.client.report [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 04:30:57 localhost nova_compute[229802]: 2025-11-26 09:30:57.981 229806 DEBUG nova.scheduler.client.report [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: 
COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_FMA3,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_F16C,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_DEVICE_TAGGING,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.019 229806 DEBUG oslo_concurrency.processutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.442 229806 DEBUG oslo_concurrency.processutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 
in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.449 229806 DEBUG nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Nov 26 04:30:58 localhost nova_compute[229802]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.449 229806 INFO nova.virt.libvirt.host [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] kernel doesn't support AMD SEV#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.451 229806 DEBUG nova.compute.provider_tree [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.452 229806 DEBUG nova.virt.libvirt.driver [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.510 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.528 
229806 DEBUG nova.scheduler.client.report [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Updated inventory for provider 05276789-7461-410b-9529-16f5185a8bff with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.529 229806 DEBUG nova.compute.provider_tree [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Updating resource provider 05276789-7461-410b-9529-16f5185a8bff generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.529 229806 DEBUG nova.compute.provider_tree [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.620 229806 DEBUG nova.compute.provider_tree [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Updating resource provider 
05276789-7461-410b-9529-16f5185a8bff generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.643 229806 DEBUG nova.compute.resource_tracker [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.644 229806 DEBUG oslo_concurrency.lockutils [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.644 229806 DEBUG nova.service [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.689 229806 DEBUG nova.service [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Nov 26 04:30:58 localhost nova_compute[229802]: 2025-11-26 09:30:58.690 229806 DEBUG nova.servicegroup.drivers.db [None req-16f9d649-5882-46ac-bd09-673d0610bd72 - - - - - -] DB_Driver: join new ServiceGroup member np0005536118.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Nov 26 04:30:59 localhost sshd[230102]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:31:00 localhost systemd-logind[761]: New session 56 of user zuul. 
Nov 26 04:31:00 localhost systemd[1]: Started Session 56 of User zuul. Nov 26 04:31:00 localhost nova_compute[229802]: 2025-11-26 09:31:00.748 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=981 DF PROTO=TCP SPT=54418 DPT=9101 SEQ=2875771101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BD97FC0000000001030307) Nov 26 04:31:01 localhost python3.9[230213]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:31:02 localhost python3.9[230327]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:31:02 localhost systemd[1]: Reloading. Nov 26 04:31:02 localhost systemd-rc-local-generator[230355]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:31:02 localhost systemd-sysv-generator[230359]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:02 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:03 localhost nova_compute[229802]: 2025-11-26 09:31:03.513 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:31:03.630 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:31:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:31:03.631 159486 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:31:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:31:03.632 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:31:03 localhost python3.9[230471]: ansible-ansible.builtin.service_facts Invoked Nov 26 04:31:03 localhost network[230488]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 04:31:03 localhost network[230489]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:31:03 localhost network[230490]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 04:31:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25625 DF PROTO=TCP SPT=40742 DPT=9101 SEQ=1758960187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BDA4FC0000000001030307) Nov 26 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:31:05 localhost nova_compute[229802]: 2025-11-26 09:31:05.751 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:31:07 localhost systemd[1]: tmp-crun.GWS2cE.mount: Deactivated successfully. 
Nov 26 04:31:07 localhost podman[230613]: 2025-11-26 09:31:07.663201755 +0000 UTC m=+0.096882041 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:31:07 localhost podman[230613]: 2025-11-26 09:31:07.771564853 +0000 UTC m=+0.205245089 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:31:07 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:31:08 localhost nova_compute[229802]: 2025-11-26 09:31:08.517 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5043 DF PROTO=TCP SPT=36724 DPT=9102 SEQ=230464318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BDB7FD0000000001030307) Nov 26 04:31:09 localhost python3.9[230751]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:31:10 localhost nova_compute[229802]: 2025-11-26 09:31:10.753 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:11 localhost python3.9[230862]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:11 localhost systemd-journald[47778]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation. Nov 26 04:31:11 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 26 04:31:11 localhost rsyslogd[760]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:31:11 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:31:11 localhost python3.9[230973]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65372 DF PROTO=TCP SPT=41606 DPT=9882 SEQ=2662291759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BDC5FC0000000001030307) Nov 26 04:31:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:31:12 localhost podman[231083]: 2025-11-26 09:31:12.771532817 +0000 UTC m=+0.087793108 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 04:31:12 localhost podman[231083]: 2025-11-26 09:31:12.805389004 +0000 UTC 
m=+0.121649315 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Nov 26 04:31:12 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:31:12 localhost python3.9[231084]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:31:13 localhost nova_compute[229802]: 2025-11-26 09:31:13.518 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:13 localhost python3.9[231211]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 26 04:31:14 localhost python3.9[231321]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:31:14 localhost systemd[1]: Reloading. Nov 26 04:31:14 localhost systemd-sysv-generator[231345]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:31:14 localhost systemd-rc-local-generator[231342]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:14 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:15 localhost python3.9[231466]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:31:15 localhost nova_compute[229802]: 2025-11-26 09:31:15.757 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46202 DF PROTO=TCP SPT=56474 DPT=9105 SEQ=1511131698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BDD2500000000001030307) Nov 26 04:31:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46203 DF PROTO=TCP SPT=56474 DPT=9105 SEQ=1511131698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BDD63C0000000001030307) Nov 26 04:31:16 localhost python3.9[231577]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:31:18 localhost python3.9[231685]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:31:18 localhost nova_compute[229802]: 2025-11-26 09:31:18.521 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46204 DF PROTO=TCP SPT=56474 DPT=9105 SEQ=1511131698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BDDE3C0000000001030307) Nov 26 04:31:18 localhost python3.9[231795]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:19 localhost python3.9[231881]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149478.5039067-360-170311524095024/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=48228bb9a278dd96d836393b631624e73deeb5e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:31:20 localhost python3.9[231991]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None Nov 26 04:31:20 localhost nova_compute[229802]: 2025-11-26 09:31:20.760 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:21 localhost python3.9[232101]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None Nov 26 04:31:22 localhost python3.9[232212]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 26 04:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:31:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46205 DF PROTO=TCP SPT=56474 DPT=9105 SEQ=1511131698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BDEDFC0000000001030307) Nov 26 04:31:22 localhost podman[232266]: 2025-11-26 09:31:22.904393681 +0000 UTC m=+0.165968845 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:31:22 localhost podman[232266]: 2025-11-26 09:31:22.941351206 +0000 UTC m=+0.202926390 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd) 
Nov 26 04:31:22 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:31:23 localhost python3.9[232347]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005536118.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Nov 26 04:31:23 localhost nova_compute[229802]: 2025-11-26 09:31:23.523 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35212 DF PROTO=TCP SPT=50960 DPT=9102 SEQ=1681022540 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BDF5BC0000000001030307) Nov 26 04:31:24 localhost python3.9[232463]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:25 localhost python3.9[232549]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False 
src=/home/zuul/.ansible/tmp/ansible-tmp-1764149484.4363606-564-13869307158529/.source.conf _original_basename=ceilometer.conf follow=False checksum=791a61e8c4f9e4e2b66cf6192d8082b61c1f5329 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:25 localhost sshd[232550]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:31:25 localhost nova_compute[229802]: 2025-11-26 09:31:25.763 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:26 localhost python3.9[232659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:26 localhost python3.9[232745]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764149485.749434-564-10756539727268/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:27 localhost python3.9[232853]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=14721 DF PROTO=TCP SPT=33526 DPT=9100 SEQ=2605163008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE01FC0000000001030307) Nov 26 04:31:28 localhost python3.9[232939]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764149487.370268-564-28167274940642/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:28 localhost nova_compute[229802]: 2025-11-26 09:31:28.525 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:28 localhost python3.9[233047]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:31:30 localhost python3.9[233155]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:31:30 localhost nova_compute[229802]: 2025-11-26 09:31:30.765 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:30 localhost python3.9[233263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:30 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35214 DF PROTO=TCP SPT=50960 DPT=9102 SEQ=1681022540 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE0D7C0000000001030307) Nov 26 04:31:31 localhost python3.9[233349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149490.4155366-741-179822556029293/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:32 localhost python3.9[233457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:32 localhost python3.9[233512]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:33 localhost python3.9[233620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Nov 26 04:31:33 localhost nova_compute[229802]: 2025-11-26 09:31:33.541 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:33 localhost python3.9[233706]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149492.8199332-741-273589067729668/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53175 DF PROTO=TCP SPT=40876 DPT=9101 SEQ=2734320842 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE19FD0000000001030307) Nov 26 04:31:34 localhost python3.9[233814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:34 localhost python3.9[233917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149493.9065354-741-214941590208320/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:35 localhost python3.9[234083]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:35 localhost podman[234126]: 2025-11-26 09:31:35.646580325 +0000 UTC m=+0.066398672 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, version=7, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553) Nov 26 04:31:35 localhost nova_compute[229802]: 2025-11-26 09:31:35.692 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 
26 04:31:35 localhost nova_compute[229802]: 2025-11-26 09:31:35.709 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Triggering sync for uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 26 04:31:35 localhost nova_compute[229802]: 2025-11-26 09:31:35.710 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:31:35 localhost nova_compute[229802]: 2025-11-26 09:31:35.710 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:31:35 localhost nova_compute[229802]: 2025-11-26 09:31:35.710 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:35 localhost podman[234126]: 2025-11-26 09:31:35.752884443 +0000 UTC m=+0.172702790 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported 
base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 04:31:35 localhost nova_compute[229802]: 2025-11-26 09:31:35.754 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:31:35 localhost nova_compute[229802]: 2025-11-26 09:31:35.767 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:35 localhost python3.9[234235]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149495.0458317-741-48829320818308/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Nov 26 04:31:36 localhost python3.9[234410]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:37 localhost python3.9[234528]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149496.0656967-741-106462729915581/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:37 localhost python3.9[234654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:31:38 localhost podman[234741]: 2025-11-26 09:31:38.343828141 +0000 UTC m=+0.093342648 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible) Nov 26 04:31:38 localhost python3.9[234740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149497.333562-741-234919599846202/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:38 localhost podman[234741]: 2025-11-26 09:31:38.391409493 +0000 UTC m=+0.140924000 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:31:38 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:31:38 localhost nova_compute[229802]: 2025-11-26 09:31:38.545 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35215 DF PROTO=TCP SPT=50960 DPT=9102 SEQ=1681022540 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE2DFC0000000001030307) Nov 26 04:31:39 localhost python3.9[234872]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:40 localhost python3.9[234958]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149499.098366-741-251825387837038/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:40 localhost nova_compute[229802]: 2025-11-26 09:31:40.769 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:41 localhost python3.9[235066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:41 localhost python3.9[235152]: 
ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149500.8583972-741-66272898427769/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53176 DF PROTO=TCP SPT=40876 DPT=9101 SEQ=2734320842 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE39FC0000000001030307) Nov 26 04:31:42 localhost python3.9[235260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13206 DF PROTO=TCP SPT=59610 DPT=9882 SEQ=1636583651 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE3BFC0000000001030307) Nov 26 04:31:43 localhost python3.9[235346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149502.07035-741-209727830449553/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:43 localhost nova_compute[229802]: 2025-11-26 09:31:43.555 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:43 localhost python3.9[235454]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:31:43 localhost podman[235471]: 2025-11-26 09:31:43.8282864 +0000 UTC m=+0.086965405 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:31:43 localhost podman[235471]: 2025-11-26 09:31:43.860556406 +0000 UTC m=+0.119235471 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:31:43 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:31:44 localhost python3.9[235558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149503.2027726-741-39895094567623/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:45 localhost python3.9[235668]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:31:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=10125 DF PROTO=TCP SPT=53348 DPT=9105 SEQ=4278758938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE47800000000001030307) Nov 26 04:31:45 localhost nova_compute[229802]: 2025-11-26 09:31:45.821 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:45 localhost python3.9[235778]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:31:45 localhost systemd[1]: Reloading. Nov 26 04:31:46 localhost systemd-rc-local-generator[235801]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:31:46 localhost systemd-sysv-generator[235805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:46 localhost systemd[1]: Listening on Podman API Socket. Nov 26 04:31:47 localhost python3.9[235927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:47 localhost python3.9[236015]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149506.8867533-1257-275460361396188/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 04:31:48 localhost python3.9[236070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:31:48 localhost nova_compute[229802]: 2025-11-26 09:31:48.594 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10127 DF PROTO=TCP SPT=53348 DPT=9105 SEQ=4278758938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE537D0000000001030307) Nov 26 04:31:49 localhost python3.9[236158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149506.8867533-1257-275460361396188/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 04:31:50 localhost nova_compute[229802]: 2025-11-26 09:31:50.823 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:51 localhost python3.9[236268]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False Nov 26 04:31:52 localhost python3.9[236378]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:31:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10128 DF PROTO=TCP SPT=53348 DPT=9105 SEQ=4278758938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE633D0000000001030307) Nov 26 04:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:31:53 localhost podman[236489]: 2025-11-26 09:31:53.564085711 +0000 UTC m=+0.072798404 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 26 04:31:53 localhost podman[236489]: 2025-11-26 09:31:53.583712705 +0000 UTC 
m=+0.092425408 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 04:31:53 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:31:53 localhost nova_compute[229802]: 2025-11-26 09:31:53.599 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:53 localhost python3[236488]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:31:53 localhost python3[236488]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5",#012 "Digest": "sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:23:50.144134741Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 
505196287,#012 "VirtualSize": 505196287,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:4ff7b15b3989ce3486d1ee120e82ba5b4acb5e4ad1d931e92c8d8e0851a32a6a",#012 "sha256:847ae301d478780c04ade872e138a0bd4b67a423f03bd51e3a177105d1684cb3"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 
in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini 
--set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 Nov 26 04:31:54 localhost podman[236557]: 2025-11-26 09:31:54.069515683 +0000 UTC m=+0.111304767 container remove 3dceb395dbc9b12b9126f5ee3562970469cd404667299589ff90844874874efe (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f94fd18b42545cee37022470afd201a1'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 26 04:31:54 localhost python3[236488]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute Nov 26 04:31:54 localhost podman[236571]: Nov 26 04:31:54 localhost podman[236571]: 2025-11-26 09:31:54.193336668 +0000 UTC m=+0.103289303 container create f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute) Nov 26 04:31:54 localhost podman[236571]: 2025-11-26 09:31:54.140977974 +0000 UTC m=+0.050930669 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Nov 26 04:31:54 localhost python3[236488]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label 
config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z 
quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start Nov 26 04:31:54 localhost nova_compute[229802]: 2025-11-26 09:31:54.661 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:54 localhost nova_compute[229802]: 2025-11-26 09:31:54.662 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:54 localhost nova_compute[229802]: 2025-11-26 09:31:54.662 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:31:54 localhost nova_compute[229802]: 2025-11-26 09:31:54.663 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:31:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30133 DF PROTO=TCP SPT=36600 DPT=9102 SEQ=3132709225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE6ABC0000000001030307) Nov 26 04:31:55 localhost python3.9[236717]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:31:55 localhost nova_compute[229802]: 2025-11-26 09:31:55.825 
229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:56 localhost nova_compute[229802]: 2025-11-26 09:31:56.804 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:31:56 localhost nova_compute[229802]: 2025-11-26 09:31:56.805 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:31:56 localhost nova_compute[229802]: 2025-11-26 09:31:56.805 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:31:56 localhost nova_compute[229802]: 2025-11-26 09:31:56.805 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:31:56 localhost python3.9[236829]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 
09:31:57.585 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.611 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.611 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.612 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.613 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.613 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.614 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.614 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.615 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.616 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.616 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.640 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.641 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.642 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.642 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 
- - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:31:57 localhost nova_compute[229802]: 2025-11-26 09:31:57.643 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:31:57 localhost python3.9[236939]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764149517.0243094-1449-91740143464485/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.042 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:31:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7289 DF PROTO=TCP SPT=54214 DPT=9101 SEQ=3941829266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE777C0000000001030307) Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.115 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have 
a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.116 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.307 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.308 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12908MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": 
"7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.308 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.309 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:31:58 localhost python3.9[237015]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None 
enabled=None force=None masked=None Nov 26 04:31:58 localhost systemd[1]: Reloading. Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.430 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.431 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.432 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.474 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:31:58 localhost systemd-rc-local-generator[237040]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:31:58 localhost systemd-sysv-generator[237047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.598 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:58 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.981 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 
04:31:58 localhost nova_compute[229802]: 2025-11-26 09:31:58.990 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:31:59 localhost nova_compute[229802]: 2025-11-26 09:31:59.014 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:31:59 localhost nova_compute[229802]: 2025-11-26 09:31:59.017 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:31:59 localhost nova_compute[229802]: 2025-11-26 09:31:59.018 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:31:59 localhost python3.9[237128]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False 
daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:31:59 localhost systemd[1]: Reloading. Nov 26 04:31:59 localhost systemd-sysv-generator[237159]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:31:59 localhost systemd-rc-local-generator[237154]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:31:59 localhost systemd[1]: Starting ceilometer_agent_compute container... Nov 26 04:31:59 localhost systemd[1]: Started libcrun container. 
Nov 26 04:31:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d4157dea8097c538f70005e1906ffff4f639c7671f60eab43cfb9497ac695be/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Nov 26 04:31:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d4157dea8097c538f70005e1906ffff4f639c7671f60eab43cfb9497ac695be/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Nov 26 04:31:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:31:59 localhost podman[237168]: 2025-11-26 09:31:59.951521817 +0000 UTC m=+0.158015842 container init f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 04:31:59 localhost ceilometer_agent_compute[237183]: + sudo -E kolla_set_configs Nov 26 04:31:59 localhost ceilometer_agent_compute[237183]: sudo: unable to send audit message: Operation not permitted Nov 26 04:31:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:31:59 localhost podman[237168]: 2025-11-26 09:31:59.987074237 +0000 UTC m=+0.193568252 container start f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:31:59 localhost podman[237168]: ceilometer_agent_compute Nov 26 04:31:59 localhost systemd[1]: Started ceilometer_agent_compute container. Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Validating config file Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Copying service configuration files Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Copying 
/var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: INFO:__main__:Writing out command to execute Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: ++ cat /run_command Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: + ARGS= Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: + sudo kolla_copy_cacerts Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: sudo: unable to send audit message: Operation not permitted Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: + [[ ! -n '' ]] Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: + . 
kolla_extend_start Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: + umask 0022 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Nov 26 04:32:00 localhost podman[237191]: 2025-11-26 09:32:00.069982801 +0000 UTC m=+0.079960121 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true) Nov 26 04:32:00 localhost podman[237191]: 2025-11-26 09:32:00.108392422 +0000 UTC m=+0.118369772 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:32:00 localhost podman[237191]: unhealthy Nov 26 04:32:00 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:32:00 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Failed with result 'exit-code'. Nov 26 04:32:00 localhost nova_compute[229802]: 2025-11-26 09:32:00.867 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30135 DF PROTO=TCP SPT=36600 DPT=9102 SEQ=3132709225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BE827C0000000001030307) Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.879 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.879 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.879 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Nov 26 
04:32:00 localhost python3.9[237323]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.879 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.880 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.880 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.880 2 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.880 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.880 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.880 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.880 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.880 2 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] 
http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.881 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG 
cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.882 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 
2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.883 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.884 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.885 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 
Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.886 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.887 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 
09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.888 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.889 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.890 2 DEBUG 
cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 
04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.891 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.892 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.913 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']]. Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.915 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d]. Nov 26 04:32:00 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:00.916 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']]. Nov 26 04:32:00 localhost systemd[1]: Stopping ceilometer_agent_compute container... 
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.012 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.013 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.074 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.074 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.074 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.074 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.075 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.076 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.077 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.078 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.079 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.080 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.081 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.082 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.083 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.084 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.085 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.086 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG
cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.088 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 
DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 
09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.089 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue 
[-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.090 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] 
oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] 
oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.091 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.092 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.095 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.102 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.114 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.114 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.114 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful 
exiting of service AgentManager(0) [12] Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.464 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}df28e28174d3e9fa8d91b14083507cba4a194a711f6d0a4b48d39b207a7e1b31" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.628 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Wed, 26 Nov 2025 09:32:01 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-367f9496-3c74-49ff-97fa-10cac631559d x-openstack-request-id: req-367f9496-3c74-49ff-97fa-10cac631559d _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.630 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "a8cafabf-98f1-4bbc-a3ca-a9382f40900b", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/a8cafabf-98f1-4bbc-a3ca-a9382f40900b"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/a8cafabf-98f1-4bbc-a3ca-a9382f40900b"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.630 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-367f9496-3c74-49ff-97fa-10cac631559d request 
/usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.633 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/a8cafabf-98f1-4bbc-a3ca-a9382f40900b -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}df28e28174d3e9fa8d91b14083507cba4a194a711f6d0a4b48d39b207a7e1b31" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.672 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Wed, 26 Nov 2025 09:32:01 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-5c0a5b5c-eaa5-479e-b567-c08fc9b19175 x-openstack-request-id: req-5c0a5b5c-eaa5-479e-b567-c08fc9b19175 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.672 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "a8cafabf-98f1-4bbc-a3ca-a9382f40900b", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/a8cafabf-98f1-4bbc-a3ca-a9382f40900b"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/a8cafabf-98f1-4bbc-a3ca-a9382f40900b"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.672 12 DEBUG 
novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/a8cafabf-98f1-4bbc-a3ca-a9382f40900b used request id req-5c0a5b5c-eaa5-479e-b567-c08fc9b19175 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.674 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.675 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.675 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.676 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.682 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9d78bef9-6977-4fb5-b50b-ae75124e73af / tap5afdc9d0-95 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.682 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30fda3a5-90bc-466e-95a3-58ab60ecdd82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.676788', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'c3360da0-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 'message_signature': 'da9fe16aaa0fbc77d8cfc5fa97ec04dca58d4ae2d6c72ac3864252be10239733'}]}, 'timestamp': '2025-11-26 09:32:01.684179', '_unique_id': '59af603eed034c55860d7b734dad764c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.697 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.716 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 52.296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e26c4cee-1438-480f-aaa7-b230bc362ec7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:32:01.697355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c33b185e-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.957962193, 'message_signature': '511c56e25f1faa750800e7c9a8fae33ef50d3d156525793d91b60930f6283ad5'}]}, 'timestamp': '2025-11-26 09:32:01.716856', '_unique_id': 'a405198e8bf34b7fb1f8ee3d5c8f8ca6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 
2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.718 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:32:01 
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.727 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.727 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f4d83c0-3721-405b-ba93-aba5a4fc23ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.718885', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c33cccb2-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.961190786, 'message_signature': 'aead781b6ca5cd4f90f78239843c3e4a50f8a39a5e2ac059526a78d117023e95'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:32:01.718885', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c33cd5ae-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.961190786, 'message_signature': '3f0d3b41b50dd502c0de5eb1af5b0c8cd762b7215cfbe122c2d26106e8cf6ee9'}]}, 'timestamp': '2025-11-26 09:32:01.728142', '_unique_id': '5978d81aa6bb4a3f9f336269c481c5af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.728 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.outgoing.packets.drop in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9310995-d1a7-4601-a2cc-ab847682c80e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.729241', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 
'c33d0920-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 'message_signature': '1b3759322010371e915d3daa8b404e63ecdddb4cfe72648cb8513f5a146bdec6'}]}, 'timestamp': '2025-11-26 09:32:01.729473', '_unique_id': 'fddbc43efd91408d942433ae95e543e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.729 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.730 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.730 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 55220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3326ecf6-42d9-45c8-867d-d8a0c65d52ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55220000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:32:01.730467', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c33d38aa-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.957962193, 'message_signature': '8b016efe03272e5ea42093c1b0aed16e4ba18c3eb9c8a6a2a5bf945f1d8af4dc'}]}, 'timestamp': '2025-11-26 09:32:01.730683', '_unique_id': '3d2ad2b8888340eaa478c2fbbd9385bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 
2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:32:01 
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.731 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99774054-92bd-4dab-a190-69b389ba851d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.731671', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'c33d6820-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 
'message_signature': '10f429762258020a2f5b5c53bb0f59abe80a0863158408281e883eedf5e7ea4c'}]}, 'timestamp': '2025-11-26 09:32:01.731904', '_unique_id': '039bb3e2571140c8af1a25d2b971411f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.732 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.757 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 627516836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.757 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 21052656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'abaf3b6e-2f0a-4b45-8fdf-b96e6c32d2a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 627516836, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.733166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3414d82-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': 'e2002ca59db23119d3d8c910f3bcf2b28ea03393c6c3abb5bd6e53ac7e5e5bba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21052656, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:32:01.733166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c341585e-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '78fcd0af516c7fe2ac37252f99eeb3387c7b5a15e967b933313763006da0d565'}]}, 'timestamp': '2025-11-26 09:32:01.757738', '_unique_id': '54ab16b35e5e4568af6e988de6e57aff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.758 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.759 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.759 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '393c3079-95c8-470a-a60c-1f33590bf008', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.759230', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3419daa-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.961190786, 'message_signature': '3218aee4bce1d7ac0d58e641d1353e592e45d805c2253c81302d60a761ddbd44'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': 
'2025-11-26T09:32:01.759230', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c341a7e6-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.961190786, 'message_signature': '68d3d3ac2b686ee0dc761e071bdea32f5e89e9086e8c352dfaba8088e67fc57c'}]}, 'timestamp': '2025-11-26 09:32:01.759774', '_unique_id': '58fb96f9fe4d41f5b68cebd64e37e4b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.760 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.761 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.761 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 222 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3ce3a9a-5cb4-44c4-81e3-3991ff7b48f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.761225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c341eba2-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '675c3ca9223e33dc03b0139ac1fbbf684e324e0979b77e4be8b92553c001d85d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 
'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:32:01.761225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c341f610-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '982bdd23eb1aa461913b675b7c5496d08a838f9ba4f2e12ef72642d00f1cc3be'}]}, 'timestamp': '2025-11-26 09:32:01.761776', '_unique_id': '187b6d6462884bc5bcb1ca865ea41e48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.762 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.763 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b4a9c93-741e-4a27-8a48-c44c7ee02f6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.763181', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'c342385a-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 'message_signature': 'cea86ffee00bf16e4ab7bf5ff108645c4add62e5b21c9ada5565a80f00f4371d'}]}, 'timestamp': '2025-11-26 09:32:01.763492', '_unique_id': 'a899c15232e540ba93f29489d51a58a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging Nov 
26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 ERROR oslo_messaging.notify.messaging Nov 26 
04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.764 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.765 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.765 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '640dbc09-691f-473e-848e-586f78acc369', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.764970', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c342803a-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.961190786, 'message_signature': 'b4644f75020f5a47a1ef43dbb752bff8697bf989178e40de0e2d473d667cacc3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:32:01.764970', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3428ecc-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.961190786, 'message_signature': '29aadfbeef514983c630b21e8690799045ef9f05e281b390ea9c91c1bd65d507'}]}, 'timestamp': '2025-11-26 09:32:01.765720', '_unique_id': 'd84b9333c9fc4952b3d99d44c5118e10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 
2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.766 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.767 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 9035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e195aa61-a891-409f-925f-b0de5a291e83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9035, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.767173', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'c342d486-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 'message_signature': 'a100eeb5a90d82642d8366bbf4f64ffb940dde078aba9980fa1866685f0a326d'}]}, 'timestamp': '2025-11-26 09:32:01.767492', '_unique_id': 'a876cbc538024613923234f6e8f884c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.769 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.769 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.769 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1141678425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 
09:32:01.769 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 173265014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b80fc44f-9f56-466d-ad21-0a56cc8fd4e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1141678425, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.769423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3432c10-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '5fa0d6aa2571b3297ef57675965945068efd9183a459ece94ad1e0557ccb2937'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 
'counter_unit': 'ns', 'counter_volume': 173265014, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:32:01.769423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c34336a6-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '274e89349a9808d9f8c1c51b8c211407b3c9e7c616c09c1e3a4e610ed341dc5d'}]}, 'timestamp': '2025-11-26 09:32:01.770010', '_unique_id': 'd692f7e7dcf844f4bdc8d510f5ff9f0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in 
connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR 
oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 
09:32:01.770 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.770 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.771 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 29130240 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.771 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a9da1a8-ab7d-4307-b957-3cb2fa067413', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.771563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3437fc6-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': 
'e0be6dd94269578533df5c0b481fdd30b77a928121edc555d7ee3520b655fc22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:32:01.771563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3438b60-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '6d73036687cbbf21aa5e1106f880932eb64c5f46c04302bfbdc1b0f849c510a1'}]}, 'timestamp': '2025-11-26 09:32:01.772168', '_unique_id': '86175ccc8a3b4fe88d7c31fecaa7ef4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 
09:32:01.772 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 
09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.772 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.773 
12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.774 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.774 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4e5f9005-cb51-4c02-93f1-1e9e403ecf8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.774255', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'c343e8c6-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 'message_signature': '5314bbe00acc5d6e8c89de2e15daa0bb2990d1eb042595c17ee3343d37c39b4b'}]}, 'timestamp': '2025-11-26 09:32:01.774559', '_unique_id': 'ac19b2b172764a7fb3234a6be750d1ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.776 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.776 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.776 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '35c8fade-f205-4684-a411-3fe2e81e9157', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.776383', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'c3443bf0-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 'message_signature': '75f66b22d41385e42537d97aa56a8c12b9c16567a5ecbd26ed9e3068b47e9f61'}]}, 'timestamp': '2025-11-26 09:32:01.776689', '_unique_id': 'efbaf4550d5848adad9e47f527a5fadb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:32:01
localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.777 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:32:01 localhost
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.778 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.778 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.778 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c632d3f-245c-46f8-b7d2-72b8e907ddba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.778095', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id':
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3447ed0-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '4a562221b5670402a89e700bcb151d3da351adb6c36ebbc0d1aaa42669487bc8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:32:01.778095', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3448948-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '7746b4bf574a2bcdde41285240343dfbeb8050d588e5d86ea30f5f6966234be1'}]}, 'timestamp': '2025-11-26 09:32:01.778647', '_unique_id': 'ac4743d9753944ef8824dc1d0813bb38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR
oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:32:01 localhost
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]:
2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:32:01 localhost
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.779 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.780 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.780 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4ee2dea-8b02-4906-9f74-90ff2142f04f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.780135', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64',
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'c344cfca-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 'message_signature': 'd36d804f9556345ab8331f7186c9c34d499f5e96e9cd9b3dd2f6731cf283c25e'}]}, 'timestamp': '2025-11-26 09:32:01.780493', '_unique_id': 'a2f97c8ad7cf4b8d952ec18008d15a91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:32:01 localhost
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:32:01 localhost
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:32:01 localhost
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.781 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.782 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '63652400-63f9-4d6c-b1b9-db75ba367248', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:32:01.781847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c345125a-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '1a17dff44229a952b365b6991090f5f31a6b9abb7916c9327c6b79f369a6dc36'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:32:01.781847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3451cf0-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.975433918, 'message_signature': '88f94e29e4d535e7305eba4370b334cf547fadee69a79529a29ca92488d9a2a4'}]}, 'timestamp': '2025-11-26 09:32:01.782449', '_unique_id': 'f4c2ce04c0d142dda5a247c962724f7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 ERROR oslo_messaging.notify.messaging Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.783 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4e659a1f-1552-439e-99c5-3ce5683384dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:32:01.783880', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'c34561ec-caaa-11f0-b2b9-fa163e73ba36', 'monotonic_time': 9996.919209172, 'message_signature': '3efd78e4dd56dce53b5b86c527a5b4c9fbc2774ca208927560fa7a6082d78d86'}]}, 'timestamp': '2025-11-26 09:32:01.784214', '_unique_id': '987eefcc43d441579705feb488671e99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:32:01 localhost 
ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:32:01 localhost ceilometer_agent_compute[237183]: 2025-11-26 09:32:01.784 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:32:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:27 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:27 localhost rsyslogd[760]: imjournal: 2129 messages lost due to rate-limiting (20000 allowed within 600 seconds) Nov 26 04:32:27 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:27 localhost python3.9[240195]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:32:27 localhost systemd[1]: Stopping podman_exporter container... Nov 26 04:32:27 localhost podman[240049]: @ - - [26/Nov/2025:09:32:25 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 2790 "" "Go-http-client/1.1" Nov 26 04:32:27 localhost systemd[1]: libpod-b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.scope: Deactivated successfully. 
Nov 26 04:32:27 localhost podman[240199]: 2025-11-26 09:32:27.785365436 +0000 UTC m=+0.076270153 container died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:32:27 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.timer: Deactivated successfully. Nov 26 04:32:27 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:32:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35217 DF PROTO=TCP SPT=50960 DPT=9102 SEQ=1681022540 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BEEBFC0000000001030307) Nov 26 04:32:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:32:28 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:32:28 localhost systemd[1]: var-lib-containers-storage-overlay-1dbb13f7b8362d56ad42775e94222a336ba84da2e29af0e3ec833723d2b6e61c-merged.mount: Deactivated successfully. Nov 26 04:32:28 localhost systemd[1]: var-lib-containers-storage-overlay-1dbb13f7b8362d56ad42775e94222a336ba84da2e29af0e3ec833723d2b6e61c-merged.mount: Deactivated successfully. Nov 26 04:32:28 localhost podman[240199]: 2025-11-26 09:32:28.094819667 +0000 UTC m=+0.385724364 container cleanup b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:32:28 localhost podman[240199]: podman_exporter Nov 26 04:32:28 localhost podman[240210]: 2025-11-26 09:32:28.158356014 +0000 UTC m=+0.369509569 container cleanup b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, 
config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:32:28 localhost nova_compute[229802]: 2025-11-26 09:32:28.614 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:29 localhost systemd[1]: var-lib-containers-storage-overlay-63fe6648b7b00c2aa594e33a7a4376216ef15cb176861bc4340d5c1601f157c0-merged.mount: Deactivated successfully. Nov 26 04:32:29 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:32:29 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:32:29 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. 
Nov 26 04:32:29 localhost systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT Nov 26 04:32:29 localhost podman[240223]: 2025-11-26 09:32:29.949449921 +0000 UTC m=+0.057142835 container cleanup b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:32:29 localhost podman[240223]: podman_exporter Nov 26 04:32:30 localhost nova_compute[229802]: 2025-11-26 09:32:30.883 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60915 DF PROTO=TCP SPT=48284 DPT=9102 SEQ=2794022035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BEF7BC0000000001030307) Nov 26 04:32:31 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:32:31 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:32:31 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:32:31 localhost systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'. Nov 26 04:32:31 localhost systemd[1]: Stopped podman_exporter container. Nov 26 04:32:31 localhost systemd[1]: Starting podman_exporter container... Nov 26 04:32:32 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:32 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:32:32 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:32:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:32:32 localhost systemd[1]: Started libcrun container. 
Nov 26 04:32:32 localhost podman[240248]: 2025-11-26 09:32:32.743629455 +0000 UTC m=+0.059462579 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 04:32:32 localhost podman[240248]: 2025-11-26 09:32:32.7803605 +0000 UTC m=+0.096193644 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm) Nov 26 04:32:32 localhost podman[240248]: unhealthy Nov 26 04:32:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 04:32:32 localhost podman[240236]: 2025-11-26 09:32:32.800972424 +0000 UTC m=+1.269867994 container init b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:32:32 localhost podman_exporter[240255]: ts=2025-11-26T09:32:32.819Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)" Nov 26 04:32:32 localhost podman_exporter[240255]: ts=2025-11-26T09:32:32.819Z caller=exporter.go:69 level=info msg=metrics enhanced=false Nov 26 04:32:32 localhost podman[240049]: @ - - [26/Nov/2025:09:32:32 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1" Nov 26 04:32:32 localhost podman[240049]: time="2025-11-26T09:32:32Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:32:32 localhost podman_exporter[240255]: ts=2025-11-26T09:32:32.820Z caller=handler.go:94 level=info msg="enabled collectors" Nov 26 04:32:32 localhost podman_exporter[240255]: ts=2025-11-26T09:32:32.820Z caller=handler.go:105 level=info collector=container Nov 26 04:32:32 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:32:32 localhost podman[240236]: 2025-11-26 09:32:32.844956431 +0000 UTC m=+1.313852031 container start b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:32:32 localhost podman[240236]: podman_exporter Nov 26 04:32:33 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:33 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:32:33 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Failed with result 'exit-code'. 
Nov 26 04:32:33 localhost nova_compute[229802]: 2025-11-26 09:32:33.654 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8837 DF PROTO=TCP SPT=49276 DPT=9101 SEQ=3554263631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF047C0000000001030307) Nov 26 04:32:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:34 localhost systemd[1]: Started podman_exporter container. Nov 26 04:32:34 localhost podman[240278]: 2025-11-26 09:32:34.231951522 +0000 UTC m=+1.382579253 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:32:34 localhost podman[240278]: 2025-11-26 09:32:34.241252127 +0000 UTC m=+1.391879908 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:32:34 localhost podman[240278]: unhealthy Nov 26 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:32:34 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:32:34 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. Nov 26 04:32:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:35 localhost python3.9[240409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:32:35 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:32:35 localhost python3.9[240497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149554.4421334-2055-111085751445130/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 26 04:32:35 localhost nova_compute[229802]: 2025-11-26 09:32:35.923 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:36 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:32:36 localhost systemd[1]: var-lib-containers-storage-overlay-1dbb13f7b8362d56ad42775e94222a336ba84da2e29af0e3ec833723d2b6e61c-merged.mount: Deactivated successfully. Nov 26 04:32:36 localhost python3.9[240607]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False Nov 26 04:32:36 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:32:36 localhost systemd[1]: var-lib-containers-storage-overlay-dd66387fc89c132424d887c45935af2d1eacff944555d01cc251c3d6d1b83282-merged.mount: Deactivated successfully. 
Nov 26 04:32:37 localhost python3.9[240717]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:32:37 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Nov 26 04:32:37 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:32:37 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:32:38 localhost python3[240863]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:32:38 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:32:38 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:32:38 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:32:38 localhost nova_compute[229802]: 2025-11-26 09:32:38.657 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:32:38 localhost podman[240920]: 2025-11-26 09:32:38.806681248 +0000 UTC m=+0.062863007 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:32:38 localhost podman[240920]: 2025-11-26 09:32:38.86442956 +0000 UTC m=+0.120611329 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:32:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60916 DF PROTO=TCP SPT=48284 DPT=9102 SEQ=2794022035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF17FC0000000001030307) Nov 26 04:32:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:32:39 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:32:39 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:32:39 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:32:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:40 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:32:40 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:40 localhost nova_compute[229802]: 2025-11-26 09:32:40.969 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:41 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:32:41 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Nov 26 04:32:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:41 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:42 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:42 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:42 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:32:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8838 DF PROTO=TCP SPT=49276 DPT=9101 SEQ=3554263631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF23FC0000000001030307) Nov 26 04:32:42 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:42 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:42 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:32:42 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:42 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:32:43 localhost podman[240963]: 2025-11-26 09:32:43.305912778 +0000 UTC m=+0.071230323 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:32:43 localhost podman[240963]: 2025-11-26 09:32:43.314885132 +0000 UTC m=+0.080202717 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:32:43 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:43 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:32:43 localhost systemd[1]: var-lib-containers-storage-overlay-dd66387fc89c132424d887c45935af2d1eacff944555d01cc251c3d6d1b83282-merged.mount: Deactivated successfully. 
Nov 26 04:32:43 localhost nova_compute[229802]: 2025-11-26 09:32:43.662 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:44 localhost sshd[240999]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:32:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:32:44 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:32:44 localhost systemd[1]: tmp-crun.YY3K2F.mount: Deactivated successfully. Nov 26 04:32:44 localhost podman[241001]: 2025-11-26 09:32:44.620681587 +0000 UTC m=+0.267792690 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 26 04:32:44 localhost podman[241001]: 2025-11-26 09:32:44.631212841 +0000 UTC m=+0.278323964 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:32:45 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Nov 26 04:32:45 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:32:45 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Nov 26 04:32:45 localhost systemd[1]: var-lib-containers-storage-overlay-14988db00eaf3274b740fc90a2db62d16af3a82b44457432a1a6aa29dc90bda4-merged.mount: Deactivated successfully. 
Nov 26 04:32:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27254 DF PROTO=TCP SPT=50370 DPT=9105 SEQ=3582486075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF31E00000000001030307) Nov 26 04:32:46 localhost nova_compute[229802]: 2025-11-26 09:32:46.001 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:46 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:32:46 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:32:46 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Nov 26 04:32:46 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Nov 26 04:32:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27255 DF PROTO=TCP SPT=50370 DPT=9105 SEQ=3582486075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF35FC0000000001030307) Nov 26 04:32:47 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:32:47 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:32:47 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:32:47 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:47 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:47 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:48 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:48 localhost nova_compute[229802]: 2025-11-26 09:32:48.697 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27256 DF PROTO=TCP SPT=50370 DPT=9105 SEQ=3582486075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF3DFD0000000001030307) Nov 26 04:32:48 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:32:48 localhost systemd[1]: var-lib-containers-storage-overlay-29f95cfe95f535f9d59641ec99dde393a4714016cd95f7dbb20217cba1000992-merged.mount: Deactivated successfully. Nov 26 04:32:49 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. 
Nov 26 04:32:49 localhost systemd[1]: var-lib-containers-storage-overlay-0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad-merged.mount: Deactivated successfully. Nov 26 04:32:49 localhost systemd[1]: var-lib-containers-storage-overlay-d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92-merged.mount: Deactivated successfully. Nov 26 04:32:49 localhost systemd[1]: var-lib-containers-storage-overlay-d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92-merged.mount: Deactivated successfully. Nov 26 04:32:50 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:50 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:32:51 localhost nova_compute[229802]: 2025-11-26 09:32:51.003 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:51 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:51 localhost systemd[1]: var-lib-containers-storage-overlay-0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad-merged.mount: Deactivated successfully. Nov 26 04:32:51 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:51 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:52 localhost systemd[1]: var-lib-containers-storage-overlay-d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92-merged.mount: Deactivated successfully. 
Nov 26 04:32:52 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Nov 26 04:32:52 localhost systemd[1]: var-lib-containers-storage-overlay-14988db00eaf3274b740fc90a2db62d16af3a82b44457432a1a6aa29dc90bda4-merged.mount: Deactivated successfully. Nov 26 04:32:52 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:52 localhost podman[240049]: time="2025-11-26T09:32:52Z" level=error msg="Getting root fs size for \"18c91518aeaf5e3312d90da56c9cbfadc13e34e8587ff5ec371069aeb8b23d64\": getting diffsize of layer \"cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy" Nov 26 04:32:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27257 DF PROTO=TCP SPT=50370 DPT=9105 SEQ=3582486075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF4DBC0000000001030307) Nov 26 04:32:53 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Nov 26 04:32:53 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. 
Nov 26 04:32:53 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:32:53 localhost nova_compute[229802]: 2025-11-26 09:32:53.701 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:32:54 localhost podman[241031]: 2025-11-26 09:32:54.320694833 +0000 UTC m=+0.083358467 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 04:32:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:32:54 localhost podman[241031]: 2025-11-26 09:32:54.339113077 +0000 UTC m=+0.101776751 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 26 04:32:54 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. 
Nov 26 04:32:54 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:54 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53463 DF PROTO=TCP SPT=38306 DPT=9102 SEQ=927469381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF553D0000000001030307) Nov 26 04:32:54 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:55 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:32:55 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:32:55 localhost systemd[1]: var-lib-containers-storage-overlay-29f95cfe95f535f9d59641ec99dde393a4714016cd95f7dbb20217cba1000992-merged.mount: Deactivated successfully. Nov 26 04:32:55 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:55 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:32:56 localhost nova_compute[229802]: 2025-11-26 09:32:56.039 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:56 localhost systemd[1]: var-lib-containers-storage-overlay-d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92-merged.mount: Deactivated successfully. Nov 26 04:32:56 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Nov 26 04:32:56 localhost systemd[1]: var-lib-containers-storage-overlay-ab1eeb830657f9ab8bbf0a1c1e595d808a09550d63278050b820041c6a307d5f-merged.mount: Deactivated successfully. Nov 26 04:32:57 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:32:57 localhost systemd[1]: var-lib-containers-storage-overlay-0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad-merged.mount: Deactivated successfully. Nov 26 04:32:57 localhost systemd[1]: var-lib-containers-storage-overlay-d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92-merged.mount: Deactivated successfully. Nov 26 04:32:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51760 DF PROTO=TCP SPT=51586 DPT=9101 SEQ=433842839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF61FC0000000001030307) Nov 26 04:32:58 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:58 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. 
Nov 26 04:32:58 localhost nova_compute[229802]: 2025-11-26 09:32:58.702 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:32:58 localhost nova_compute[229802]: 2025-11-26 09:32:58.961 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:32:58 localhost nova_compute[229802]: 2025-11-26 09:32:58.962 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:32:59 localhost nova_compute[229802]: 2025-11-26 09:32:59.024 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:32:59 localhost nova_compute[229802]: 2025-11-26 09:32:59.024 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:32:59 localhost nova_compute[229802]: 2025-11-26 09:32:59.024 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:32:59 
localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:59 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:59 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:32:59 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:32:59 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:32:59 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:32:59 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:00 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Nov 26 04:33:00 localhost systemd[1]: var-lib-containers-storage-overlay-97e8961208df3b0c873a479fd1758a4d1fe73c2607c9ea38c2b5c398da67b5e0-merged.mount: Deactivated successfully. 
Nov 26 04:33:00 localhost nova_compute[229802]: 2025-11-26 09:33:00.834 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:33:00 localhost nova_compute[229802]: 2025-11-26 09:33:00.834 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:33:00 localhost nova_compute[229802]: 2025-11-26 09:33:00.834 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:33:00 localhost nova_compute[229802]: 2025-11-26 09:33:00.834 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:33:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53465 DF PROTO=TCP SPT=38306 DPT=9102 SEQ=927469381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF6CFD0000000001030307) Nov 26 04:33:00 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:33:00 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:01 localhost nova_compute[229802]: 2025-11-26 09:33:01.090 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:01 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:01 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:01 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:01 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Nov 26 04:33:01 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:33:01 localhost nova_compute[229802]: 2025-11-26 09:33:01.986 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.155 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.155 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache 
for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.155 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.155 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.155 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.156 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.156 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.156 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.156 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.156 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.179 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.179 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.180 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.180 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 
- - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.180 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:33:02 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:02 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:02 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:02 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:02 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.577 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:33:02 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.643 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.644 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.817 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.818 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12520MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.819 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.819 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.927 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.928 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.928 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:33:02 localhost nova_compute[229802]: 2025-11-26 09:33:02.976 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:33:03 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. 
Nov 26 04:33:03 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:03 localhost podman[240890]: 2025-11-26 09:32:38.447125586 +0000 UTC m=+0.051782955 image pull quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7 Nov 26 04:33:03 localhost nova_compute[229802]: 2025-11-26 09:33:03.412 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:33:03 localhost nova_compute[229802]: 2025-11-26 09:33:03.420 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:33:03 localhost nova_compute[229802]: 2025-11-26 09:33:03.439 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:33:03 localhost nova_compute[229802]: 2025-11-26 09:33:03.441 229806 DEBUG nova.compute.resource_tracker [None 
req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:33:03 localhost nova_compute[229802]: 2025-11-26 09:33:03.442 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:33:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:33:03 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:03 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:03 localhost systemd[1]: var-lib-containers-storage-overlay-ab1eeb830657f9ab8bbf0a1c1e595d808a09550d63278050b820041c6a307d5f-merged.mount: Deactivated successfully. 
Nov 26 04:33:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:33:03.631 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:33:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:33:03.632 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:33:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:33:03.633 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:33:03 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:33:03 localhost nova_compute[229802]: 2025-11-26 09:33:03.724 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:03 localhost podman[241105]: 2025-11-26 09:33:03.73025668 +0000 UTC m=+0.244135569 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 26 04:33:03 localhost podman[241105]: 2025-11-26 09:33:03.764464046 +0000 UTC m=+0.278342965 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 04:33:03 localhost podman[241105]: unhealthy Nov 26 04:33:04 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51762 DF PROTO=TCP SPT=51586 DPT=9101 SEQ=433842839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF79BC0000000001030307) Nov 26 04:33:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:33:05 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:33:06 localhost systemd[1]: var-lib-containers-storage-overlay-91ede8908568d13e7e992d722f3ffe3beb21394f78b9650d7ef9bdba7da629a5-merged.mount: Deactivated successfully. Nov 26 04:33:06 localhost nova_compute[229802]: 2025-11-26 09:33:06.092 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:06 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:33:06 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Failed with result 'exit-code'. 
Nov 26 04:33:06 localhost podman[241134]: 2025-11-26 09:33:06.173215857 +0000 UTC m=+0.431625661 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:33:06 localhost podman[241134]: 2025-11-26 09:33:06.210267002 +0000 UTC m=+0.468676766 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:33:06 localhost podman[241134]: unhealthy Nov 26 04:33:07 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:33:07 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Nov 26 04:33:07 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Nov 26 04:33:08 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:08 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:33:08 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:33:08 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:33:08 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. 
Nov 26 04:33:08 localhost podman[241168]: 2025-11-26 09:33:08.603729697 +0000 UTC m=+2.402605886 image pull quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7 Nov 26 04:33:08 localhost nova_compute[229802]: 2025-11-26 09:33:08.730 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45307 DF PROTO=TCP SPT=34028 DPT=9100 SEQ=1121373155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF8DFC0000000001030307) Nov 26 04:33:09 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:33:10 localhost podman[241182]: 2025-11-26 09:33:10.026741292 +0000 UTC m=+0.077605064 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 04:33:10 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. 
Nov 26 04:33:10 localhost podman[241182]: 2025-11-26 09:33:10.058986076 +0000 UTC m=+0.109849818 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 26 04:33:10 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:10 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:10 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:33:11 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:11 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:33:11 localhost nova_compute[229802]: 2025-11-26 09:33:11.095 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:11 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:11 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:11 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:33:12 localhost podman[240049]: time="2025-11-26T09:33:12Z" level=error msg="Getting root fs size for \"373d5c869df5f4b6c549fc0bffc963e5064f15198d84a2ec31e2346176efccd1\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy" Nov 26 04:33:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51763 DF PROTO=TCP SPT=51586 DPT=9101 SEQ=433842839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF99FD0000000001030307) Nov 26 04:33:12 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:12 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40146 DF PROTO=TCP SPT=40422 DPT=9882 SEQ=56969187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF9BFC0000000001030307) Nov 26 04:33:12 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:33:13 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:13 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Nov 26 04:33:13 localhost nova_compute[229802]: 2025-11-26 09:33:13.732 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:14 localhost systemd[1]: var-lib-containers-storage-overlay-8429e6696a92b25e8a426aacc63cb14bbf8015d1d1cfea8ab0510d96125cec37-merged.mount: Deactivated successfully. Nov 26 04:33:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:33:14 localhost systemd[1]: tmp-crun.ZBnf76.mount: Deactivated successfully. Nov 26 04:33:14 localhost podman[241206]: 2025-11-26 09:33:14.817916788 +0000 UTC m=+0.083544282 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:33:14 localhost podman[241206]: 2025-11-26 09:33:14.85326393 +0000 UTC m=+0.118891474 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The 
Prometheus Authors ) Nov 26 04:33:15 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:33:15 localhost systemd[1]: var-lib-containers-storage-overlay-91ede8908568d13e7e992d722f3ffe3beb21394f78b9650d7ef9bdba7da629a5-merged.mount: Deactivated successfully. Nov 26 04:33:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48263 DF PROTO=TCP SPT=33682 DPT=9105 SEQ=3127707711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFA7100000000001030307) Nov 26 04:33:15 localhost systemd[1]: var-lib-containers-storage-overlay-91ede8908568d13e7e992d722f3ffe3beb21394f78b9650d7ef9bdba7da629a5-merged.mount: Deactivated successfully. Nov 26 04:33:16 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:16 localhost nova_compute[229802]: 2025-11-26 09:33:16.098 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:16 localhost podman[241168]: Nov 26 04:33:16 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:33:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:33:16 localhost podman[241168]: 2025-11-26 09:33:16.399998961 +0000 UTC m=+10.198875140 container create a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.) Nov 26 04:33:16 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:33:17 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. 
Nov 26 04:33:17 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Nov 26 04:33:17 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Nov 26 04:33:18 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:18 localhost python3[240863]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7 Nov 26 04:33:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:18 localhost podman[241230]: 2025-11-26 09:33:18.621966633 +0000 UTC m=+2.217723438 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:33:18 localhost podman[241230]: 2025-11-26 09:33:18.656238232 +0000 UTC m=+2.251995027 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 04:33:18 localhost nova_compute[229802]: 2025-11-26 09:33:18.758 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:18 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48265 DF PROTO=TCP SPT=33682 DPT=9105 SEQ=3127707711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFB2FC0000000001030307) Nov 26 04:33:19 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:33:19 localhost podman[240049]: time="2025-11-26T09:33:19Z" level=error msg="Getting root fs size for \"3a5b352373b1cb4201e9a0c41bc14392b8ec31b7ee990afd81cebce34fe31490\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy" Nov 26 04:33:20 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:20 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:33:20 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:33:20 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:33:20 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:21 localhost nova_compute[229802]: 2025-11-26 09:33:21.100 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:21 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:33:21 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:21 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:22 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:33:22 localhost systemd[1]: var-lib-containers-storage-overlay-3df8000495686f63312283822b59896f01c9669473559757e06844f715334a89-merged.mount: Deactivated successfully. Nov 26 04:33:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48266 DF PROTO=TCP SPT=33682 DPT=9105 SEQ=3127707711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFC2BD0000000001030307) Nov 26 04:33:22 localhost systemd[1]: var-lib-containers-storage-overlay-3df8000495686f63312283822b59896f01c9669473559757e06844f715334a89-merged.mount: Deactivated successfully. Nov 26 04:33:23 localhost sshd[241288]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:33:23 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:23 localhost nova_compute[229802]: 2025-11-26 09:33:23.761 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:23 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:33:24 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30245 DF PROTO=TCP SPT=56802 DPT=9102 SEQ=2974909597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFCA7C0000000001030307) Nov 26 04:33:25 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. Nov 26 04:33:25 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully. Nov 26 04:33:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:33:25 localhost podman[241290]: 2025-11-26 09:33:25.412783256 +0000 UTC m=+0.079122103 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 04:33:25 localhost podman[241290]: 2025-11-26 09:33:25.449308355 +0000 UTC m=+0.115647252 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:33:25 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully. 
Nov 26 04:33:25 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Nov 26 04:33:25 localhost systemd[1]: var-lib-containers-storage-overlay-8429e6696a92b25e8a426aacc63cb14bbf8015d1d1cfea8ab0510d96125cec37-merged.mount: Deactivated successfully. Nov 26 04:33:25 localhost systemd[1]: var-lib-containers-storage-overlay-8429e6696a92b25e8a426aacc63cb14bbf8015d1d1cfea8ab0510d96125cec37-merged.mount: Deactivated successfully. Nov 26 04:33:25 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:33:26 localhost nova_compute[229802]: 2025-11-26 09:33:26.103 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:26 localhost python3.9[241402]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:33:27 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:33:27 localhost python3.9[241514]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:33:27 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. 
Nov 26 04:33:27 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. Nov 26 04:33:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60918 DF PROTO=TCP SPT=48284 DPT=9102 SEQ=2794022035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFD5FD0000000001030307) Nov 26 04:33:28 localhost python3.9[241623]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764149607.4362125-2214-139713718926579/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:33:28 localhost python3.9[241678]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:33:28 localhost systemd[1]: Reloading. 
Nov 26 04:33:28 localhost systemd-rc-local-generator[241700]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:33:28 localhost systemd-sysv-generator[241706]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:33:28 localhost nova_compute[229802]: 2025-11-26 09:33:28.814 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:33:29 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. Nov 26 04:33:29 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:33:29 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. Nov 26 04:33:29 localhost python3.9[241768]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:33:29 localhost systemd[1]: Reloading. Nov 26 04:33:29 localhost systemd-sysv-generator[241793]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:33:29 localhost systemd-rc-local-generator[241788]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:33:29 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. Nov 26 04:33:30 localhost systemd[1]: Starting openstack_network_exporter container... Nov 26 04:33:30 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:33:30 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8840 DF PROTO=TCP SPT=49276 DPT=9101 SEQ=3554263631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFE1FC0000000001030307) Nov 26 04:33:31 localhost nova_compute[229802]: 2025-11-26 09:33:31.107 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:31 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:31 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully. Nov 26 04:33:32 localhost systemd[1]: var-lib-containers-storage-overlay-1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f-merged.mount: Deactivated successfully. Nov 26 04:33:32 localhost systemd[1]: var-lib-containers-storage-overlay-1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f-merged.mount: Deactivated successfully. Nov 26 04:33:32 localhost systemd[1]: Started libcrun container. 
Nov 26 04:33:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52b601c5c7ea760591d62e0a7b681ed0d9c14a3884b5ceaf5260b64deaa8b4d6/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Nov 26 04:33:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52b601c5c7ea760591d62e0a7b681ed0d9c14a3884b5ceaf5260b64deaa8b4d6/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff) Nov 26 04:33:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:33:32 localhost podman[241809]: 2025-11-26 09:33:32.382635782 +0000 UTC m=+2.314729270 container init a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, architecture=x86_64, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9) Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *bridge.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *coverage.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *datapath.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *iface.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *memory.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering 
*ovnnorthd.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *ovn.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *ovsdbserver.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *pmd_perf.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *pmd_rxq.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: INFO 09:33:32 main.go:48: registering *vswitch.Collector Nov 26 04:33:32 localhost openstack_network_exporter[241824]: NOTICE 09:33:32 main.go:82: listening on http://:9105/metrics Nov 26 04:33:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:33:32 localhost podman[241809]: 2025-11-26 09:33:32.416666279 +0000 UTC m=+2.348759767 container start a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=) Nov 26 04:33:32 localhost podman[241809]: openstack_network_exporter Nov 26 04:33:32 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully. Nov 26 04:33:32 localhost systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully. Nov 26 04:33:33 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:33 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:33 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:33 localhost systemd[1]: Started openstack_network_exporter container. Nov 26 04:33:33 localhost podman[241835]: 2025-11-26 09:33:33.621602831 +0000 UTC m=+1.198831321 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7) Nov 26 04:33:33 localhost podman[241835]: 2025-11-26 09:33:33.656078049 +0000 UTC m=+1.233306529 container exec_died 
a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc.) Nov 26 04:33:33 localhost nova_compute[229802]: 2025-11-26 09:33:33.844 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39631 DF PROTO=TCP SPT=39682 DPT=9101 SEQ=4279197687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFEEBC0000000001030307) Nov 26 04:33:34 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:33:34 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:34 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:34 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:33:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:33:35 localhost python3.9[241967]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:33:35 localhost systemd[1]: Stopping openstack_network_exporter container... Nov 26 04:33:35 localhost systemd[1]: libpod-a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.scope: Deactivated successfully. Nov 26 04:33:35 localhost podman[241971]: 2025-11-26 09:33:35.298186622 +0000 UTC m=+0.089638700 container died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, version=9.6) Nov 26 04:33:35 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.timer: Deactivated successfully. Nov 26 04:33:35 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:33:35 localhost systemd[1]: tmp-crun.wQgGd7.mount: Deactivated successfully. Nov 26 04:33:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:33:36 localhost nova_compute[229802]: 2025-11-26 09:33:36.111 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:33:36 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:37 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:33:37 localhost systemd[1]: var-lib-containers-storage-overlay-3df8000495686f63312283822b59896f01c9669473559757e06844f715334a89-merged.mount: Deactivated successfully. Nov 26 04:33:37 localhost systemd[1]: var-lib-containers-storage-overlay-52b601c5c7ea760591d62e0a7b681ed0d9c14a3884b5ceaf5260b64deaa8b4d6-merged.mount: Deactivated successfully. Nov 26 04:33:37 localhost podman[241971]: 2025-11-26 09:33:37.509628208 +0000 UTC m=+2.301080276 container cleanup a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal) Nov 26 04:33:37 localhost podman[241971]: openstack_network_exporter Nov 26 04:33:37 localhost podman[241997]: 2025-11-26 09:33:37.59527927 +0000 UTC m=+0.861031207 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:33:37 localhost podman[241984]: 2025-11-26 09:33:37.628484025 +0000 UTC m=+2.325071245 container cleanup a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 26 04:33:37 localhost podman[241997]: 2025-11-26 09:33:37.630332745 +0000 UTC m=+0.896084692 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:33:37 localhost podman[241997]: unhealthy Nov 26 04:33:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:33:38 localhost nova_compute[229802]: 2025-11-26 09:33:38.896 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30248 DF PROTO=TCP SPT=56802 DPT=9102 SEQ=2974909597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C001FC0000000001030307) Nov 26 04:33:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:39 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:39 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. 
Nov 26 04:33:39 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully. Nov 26 04:33:39 localhost systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT Nov 26 04:33:39 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:33:39 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Failed with result 'exit-code'. Nov 26 04:33:39 localhost podman[242014]: 2025-11-26 09:33:39.88002947 +0000 UTC m=+1.138844692 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:33:39 localhost podman[242014]: 2025-11-26 09:33:39.918603198 +0000 UTC m=+1.177418420 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:33:39 localhost podman[242014]: unhealthy Nov 26 04:33:39 localhost podman[242026]: 2025-11-26 09:33:39.978121084 +0000 UTC m=+0.125525486 container cleanup a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter) Nov 26 04:33:39 localhost podman[242026]: openstack_network_exporter Nov 26 04:33:40 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:33:40 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:40 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:41 localhost nova_compute[229802]: 2025-11-26 09:33:41.114 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:41 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:33:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:33:41 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:33:41 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. 
Nov 26 04:33:41 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. Nov 26 04:33:41 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:33:41 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. Nov 26 04:33:41 localhost systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'. Nov 26 04:33:41 localhost systemd[1]: Stopped openstack_network_exporter container. Nov 26 04:33:41 localhost systemd[1]: Starting openstack_network_exporter container... Nov 26 04:33:41 localhost podman[242099]: 2025-11-26 09:33:41.675810268 +0000 UTC m=+0.520556824 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:33:41 localhost podman[242099]: 2025-11-26 09:33:41.710843281 +0000 UTC m=+0.555589797 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 04:33:42 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:33:42 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:42 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:33:42 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:33:42 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:33:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8643 DF PROTO=TCP SPT=53440 DPT=9882 SEQ=3871305389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C00FFC0000000001030307) Nov 26 04:33:43 localhost systemd[1]: Started libcrun container. Nov 26 04:33:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52b601c5c7ea760591d62e0a7b681ed0d9c14a3884b5ceaf5260b64deaa8b4d6/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Nov 26 04:33:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52b601c5c7ea760591d62e0a7b681ed0d9c14a3884b5ceaf5260b64deaa8b4d6/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff) Nov 26 04:33:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:33:43 localhost podman[242113]: 2025-11-26 09:33:43.247662879 +0000 UTC m=+1.565034190 container init a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350) Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *bridge.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *coverage.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *datapath.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *iface.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: 
registering *memory.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *ovnnorthd.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *ovn.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *ovsdbserver.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *pmd_perf.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *pmd_rxq.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: INFO 09:33:43 main.go:48: registering *vswitch.Collector Nov 26 04:33:43 localhost openstack_network_exporter[242153]: NOTICE 09:33:43 main.go:82: listening on http://:9105/metrics Nov 26 04:33:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:33:43 localhost podman[242113]: 2025-11-26 09:33:43.280550115 +0000 UTC m=+1.597921406 container start a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 26 04:33:43 localhost podman[242113]: openstack_network_exporter Nov 26 04:33:43 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. 
Nov 26 04:33:43 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 26 04:33:43 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 26 04:33:43 localhost nova_compute[229802]: 2025-11-26 09:33:43.924 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:33:45 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:33:45 localhost systemd[1]: var-lib-containers-storage-overlay-0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2-merged.mount: Deactivated successfully.
Nov 26 04:33:45 localhost systemd[1]: Started openstack_network_exporter container.
Nov 26 04:33:45 localhost podman[242163]: 2025-11-26 09:33:45.69569209 +0000 UTC m=+2.409776743 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, architecture=x86_64, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.) 
Nov 26 04:33:45 localhost podman[242163]: 2025-11-26 09:33:45.712791165 +0000 UTC m=+2.426875808 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 26 04:33:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38460 DF PROTO=TCP SPT=57066 DPT=9105 SEQ=3451510457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C01C400000000001030307)
Nov 26 04:33:46 localhost nova_compute[229802]: 2025-11-26 09:33:46.115 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:33:46 localhost python3.9[242309]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 26 04:33:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.
Nov 26 04:33:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38461 DF PROTO=TCP SPT=57066 DPT=9105 SEQ=3451510457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0203C0000000001030307)
Nov 26 04:33:47 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully.
Nov 26 04:33:47 localhost systemd[1]: var-lib-containers-storage-overlay-1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f-merged.mount: Deactivated successfully.
Nov 26 04:33:48 localhost systemd[1]: var-lib-containers-storage-overlay-1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f-merged.mount: Deactivated successfully.
Nov 26 04:33:48 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:33:48 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully.
Nov 26 04:33:48 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:33:48 localhost systemd[1]: var-lib-containers-storage-overlay-0438ade5aeea533b00cd75095bec75fbc2b307bace4c89bb39b75d428637bcd8-merged.mount: Deactivated successfully.
Nov 26 04:33:48 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Nov 26 04:33:48 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Nov 26 04:33:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38462 DF PROTO=TCP SPT=57066 DPT=9105 SEQ=3451510457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0283D0000000001030307) Nov 26 04:33:48 localhost nova_compute[229802]: 2025-11-26 09:33:48.974 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:48 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:33:49 localhost podman[242327]: 2025-11-26 09:33:49.01087968 +0000 UTC m=+2.267789528 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:33:49 localhost podman[242327]: 2025-11-26 09:33:49.021398401 +0000 UTC m=+2.278308259 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:33:50 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:33:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:33:51 localhost nova_compute[229802]: 2025-11-26 09:33:51.117 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:33:51 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:33:51 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:33:51 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:33:51 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully.
Nov 26 04:33:51 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:33:51 localhost podman[242350]: 2025-11-26 09:33:51.764258059 +0000 UTC m=+1.234140671 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent) Nov 26 04:33:51 localhost podman[242350]: 2025-11-26 09:33:51.795107991 +0000 UTC 
m=+1.264990593 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:33:52 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:33:52 localhost podman[240049]: time="2025-11-26T09:33:52Z" level=error msg="Getting root fs size for \"5b3681e874015ba7279a1323490b43d4c2358ed6d099870ec7a1493286a1e6a3\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Nov 26 04:33:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38463 DF PROTO=TCP SPT=57066 DPT=9105 SEQ=3451510457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C037FC0000000001030307)
Nov 26 04:33:53 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:33:53 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:33:53 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:33:53 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:33:53 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:33:54 localhost nova_compute[229802]: 2025-11-26 09:33:54.020 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:33:54 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:33:54 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:33:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31025 DF PROTO=TCP SPT=53458 DPT=9102 SEQ=2729808065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C03F7D0000000001030307)
Nov 26 04:33:54 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:33:55 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:33:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:33:55 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:33:56 localhost podman[242369]: 2025-11-26 09:33:56.066405764 +0000 UTC m=+0.086923498 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Nov 26 04:33:56 localhost podman[242369]: 2025-11-26 09:33:56.082343268 +0000 UTC m=+0.102860952 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd) Nov 26 04:33:56 localhost nova_compute[229802]: 2025-11-26 09:33:56.119 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:33:56 localhost systemd[1]: tmp-crun.oD6yOA.mount: Deactivated successfully. 
Nov 26 04:33:56 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:33:56 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:33:56 localhost systemd[1]: var-lib-containers-storage-overlay-6b40f2e3ce469630e9b0ac8513bc38db78e513919de8034db1894708fd48cdba-merged.mount: Deactivated successfully.
Nov 26 04:33:56 localhost systemd[1]: var-lib-containers-storage-overlay-6b40f2e3ce469630e9b0ac8513bc38db78e513919de8034db1894708fd48cdba-merged.mount: Deactivated successfully.
Nov 26 04:33:56 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully.
Nov 26 04:33:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53468 DF PROTO=TCP SPT=38306 DPT=9102 SEQ=927469381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C04BFC0000000001030307)
Nov 26 04:33:58 localhost systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully.
Nov 26 04:33:58 localhost systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully.
Nov 26 04:33:58 localhost systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully.
Nov 26 04:33:59 localhost nova_compute[229802]: 2025-11-26 09:33:59.079 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:33:59 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:33:59 localhost systemd[1]: var-lib-containers-storage-overlay-0341f1887aae20a301e856089dc461ce52079f292afb39f1be5bab8c0d01f7a2-merged.mount: Deactivated successfully.
Nov 26 04:34:00 localhost systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully.
Nov 26 04:34:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31027 DF PROTO=TCP SPT=53458 DPT=9102 SEQ=2729808065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0573C0000000001030307)
Nov 26 04:34:00 localhost systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully.
Nov 26 04:34:01 localhost systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully.
Nov 26 04:34:01 localhost nova_compute[229802]: 2025-11-26 09:34:01.120 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:34:01 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:34:01 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:34:02 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:34:02 localhost sshd[242388]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:34:02 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 26 04:34:03 localhost systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully.
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.443 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.444 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.444 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.444 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.512 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.512 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.513 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.513 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.557 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.558 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.561 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40520c4b-e8e9-415d-acc0-7d1e3d74155e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.558762', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0bdb48c2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': 'f9a54398e497c586095ee8453fc41a9bba8a26f4eda74e906c96f1276fd818d8'}]}, 'timestamp': '2025-11-26 09:34:03.562753', '_unique_id': '665c690b9ba444f4bd82279377749907'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.564 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.565 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.565 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2291627d-fd37-44da-a7fd-980a27dd673c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.565693', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0bdbd35a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': 'c688d7ad65cb383c1fbb5d9b228d521c55cf6840f125109890ba4e139cadf979'}]}, 'timestamp': '2025-11-26 09:34:03.566378', '_unique_id': 'd4d017acfc3b4209962b043172856b94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.567 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.568 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.597 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 627516836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.598 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 21052656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8eae38a-80c0-480f-bc4b-c1150cf4cff1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 627516836, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.568522', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0be0acf4-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': 'bfc396894022a28b904e18b00a7f91a2a5637bd68d25a4b86c1aa7124f8efbb5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21052656, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.568522', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0be0c144-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': 'b3850edc08c5f4148cf81a4aa3aea5aa48c8b16b344d4681adebfad0495b8d04'}]}, 'timestamp': '2025-11-26 09:34:03.598481', '_unique_id': 'f7c03324478c438ba00fc2d22b652ad2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]:
2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.599 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.612 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.613 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '42858112-4fc9-4f6e-bc6e-d557e41d62c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.601464', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0be2fd60-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.843733087, 'message_signature': 'e8d27187c19075aa023ef93f33dc14d1ab78841e24ab9b768d122f8196e97abe'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.601464', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0be31084-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.843733087, 'message_signature': '2eccd3838e643f4d102581a89226a240eeef3c909235427c1be4811e6eef7fb7'}]}, 'timestamp': '2025-11-26 09:34:03.613666', '_unique_id': '192e5e70b5754bcb9ba1a0415c2e6b1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:34:03.614 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:34:03.614 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.614 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.615 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.616 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.616 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:34:03.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '474dda1f-099e-4065-8cd0-0a9055a99ddb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.616143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0be384e2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': 'dfe37ee399448007b90482d15299df73f00861da9c30e47dd7699fa241d4b418'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.616143', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0be397f2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': 'f2b505b4e85118ffce52f7682d6b6c3310a3f88737dff9c3172d2946cfa06927'}]}, 'timestamp': '2025-11-26 09:34:03.617123', '_unique_id': 'deb5344c09874f26bd3cd54532e73108'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.618 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.619 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.619 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 9035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3812b351-3618-42d0-a058-d4d7e508a17f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9035, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.619458', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0be406a6-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': 'ba799f04e7b8d64c7ba8c3db1e41f26c4b34f5ce9e28c4afe678439e6b582ab5'}]}, 'timestamp': '2025-11-26 09:34:03.619975', '_unique_id': 'cde7c64d6f3146948e661a8de973b3f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.620 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.627 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.627 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.627 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7de9092-af28-41ef-97bc-1d369e7d96fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.627422', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0be53ea4-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': 'c21c4974d976fa04bd65cc78192b62548e12abbe7e4c97002c1777d022cc8d8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.627422', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0be55358-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': '6b015e4a3a250161131d96696f7b4c01095695a65141b29c7a243b68f98bd7bd'}]}, 'timestamp': '2025-11-26 09:34:03.628440', '_unique_id': '71dfd73096aa4eeab6f59a19b0237f6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.629 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.631 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:34:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:34:03.633 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:34:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:34:03.633 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:34:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:34:03.635 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.647 12 DEBUG ceilometer.compute.pollsters [-] 
9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 52.296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50af9017-66fc-4b2c-bb01-af401d39c065', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:34:03.631516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0be84ff4-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.889600929, 'message_signature': '33daf6a0ecf244c550cd74e6dad0faf57e18928cdd9659c545193f7843a2e624'}]}, 'timestamp': '2025-11-26 09:34:03.648101', '_unique_id': '408eef6ccca8473798061b1143e92f5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 
04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.650 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1141678425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.651 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 173265014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2ae6cf89-6e55-465e-a3b4-8a5c58dd4c0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1141678425, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.650617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0be8c79a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': '75660c96b65382a458d8799d0f6f79eb91ca39725e9c40ca3b83437a1dddc52d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 173265014, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.650617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0be8d992-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': '9d74405ad8d5269c6c56175bd1937d78740708bda8d31a805d48f77ae9d1d05d'}]}, 'timestamp': '2025-11-26 09:34:03.651525', '_unique_id': '6788d5b035cc4aeaa2ea6ec21bfe115e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.653 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.653 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '44882ec2-22ae-4b9a-bfb6-a8ce788eda7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.653753', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0be9436e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': '0b2d8125a1817d4bb6a4a008ad79d68ad5f2f1c131c636376a455450cf909ac6'}]}, 'timestamp': '2025-11-26 09:34:03.654260', '_unique_id': '4112d0ef0d7e4db4aa29b7b7c254e774'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.656 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.656 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c424047-32ad-44dc-9fc3-1d94fcd9ddb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.656480', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0be9ae9e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': 'a70e6642809a1db1e0a7a856f4a8841b032475d7f86a1ecaa17a076d9d0608e9'}]}, 'timestamp': '2025-11-26 09:34:03.657042', '_unique_id': '5d6351bb8cd24924a55e8d411c47c54d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.657 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.659 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.659 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 56290000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '02d31316-31ff-4d5a-bc60-152e8e46e8b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56290000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:34:03.659144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '0bea14e2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.889600929, 'message_signature': '937a8311a0a5dced1e973f63bac3f108945a4acc7fd6ee9d92108b9e8c08f614'}]}, 'timestamp': '2025-11-26 09:34:03.659609', '_unique_id': 'b5dbe5dae1c1411a8d00f49defd44d12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.660 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.661 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:34:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.661 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.662 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ff73c90-036f-4117-9870-df597e7549d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.661736', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0bea7aae-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': 'd1890976eb4dae946ed4dd6d3e0b8353643f6bb15b753d50a02e8a88ce9fe5e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.661736', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0bea8af8-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': '3a6eb0411ad5013a31b533ef87dabffef076c69fac5e4a5724724ba21ee9da85'}]}, 'timestamp': '2025-11-26 09:34:03.662627', '_unique_id': '61a4baec72c04945917909246095d751'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:34:03.663 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.663 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.664 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.664 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec7bef6c-c42a-414a-800b-0fed26141ee5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.664828', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 
'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0beaf3ee-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': '53f3a8c2bb3a8c96b94bfcd54ea02ff9de7779cf482f367deb8be9068d94b349'}]}, 'timestamp': '2025-11-26 09:34:03.665332', '_unique_id': '78f47aef558645238ef6ac2d9b2c4925'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.667 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.667 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4ae656ab-42a9-4e7b-bba2-1d4dfbc39bc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.667619', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0beb5f5a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': 'e4cdd102d9c2a6a52e7a9b8c41743851cc2d1bad8931792fedaac7faca644589'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.667619', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0beb717a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.810783689, 'message_signature': '9870d729910b1d030f3b22c3b5804b2d47f2621587786827f930c131b1c0c41c'}]}, 'timestamp': '2025-11-26 09:34:03.668518', '_unique_id': '6f41219366044a30b0aa2d862b501c3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.669 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.670 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.670 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82366f12-8509-4c2d-bb0f-e1ee6b54e230', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.670674', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0bebd6ec-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': '1f52293ed31a2e6c8f9d46318e1ebc8454ba179a5905a0e011de23df9b32cea3'}]}, 'timestamp': '2025-11-26 09:34:03.671175', '_unique_id': '79cf87e828ec42e5814589c4c1b6f690'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.672 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.673 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.673 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.673 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16e84ff5-daf5-4c0d-8bb1-8e433773b7b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.673289', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0bec3d4e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.843733087, 'message_signature': '44808c3056b93e210772b588b82c35bf4407078185601edbe09a4a2e148ca57c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.673289', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0bec4e92-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.843733087, 'message_signature': '493d498a917a480a7be8c5b20b7eabd966ece9436440b1ac8b23c0de216887dd'}]}, 'timestamp': '2025-11-26 09:34:03.674182', '_unique_id': '5a129cdf55fe4c4abc5962444cf87a19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.675 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.676 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3003130c-dafa-44c3-91fd-bcd49239aded', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.676331', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0becb3f0-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': '565f08348fee37120aca6c976773f17525cbabc9867ff39332c28dc38f274c67'}]}, 'timestamp': '2025-11-26 09:34:03.676827', '_unique_id': '6f18ae83f9184bfcbfc2996287bbd65c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877,
in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c284a512-2cf7-4a19-9206-ea1198845840', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.678469', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0bed03a0-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': '1d7310e1d2e3ac2be060d9b6e3b0fc743b654b2f97a78fa3e094ea009382faf9'}]}, 'timestamp': '2025-11-26 09:34:03.678752', '_unique_id': 'b65d1dacac2547d88be0f339fe15a01d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.680 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.680 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d3e2edb-ee56-4dec-9bbd-3c3db270cebf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:34:03.680124', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0bed4450-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.843733087, 'message_signature': 'd676d394e53fa2fa983c4fdfbf619cda8d4d36b970874f4773553a7061c49e2d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:34:03.680124', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0bed4ea0-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.843733087, 'message_signature': '2b2c7ebfa91eb5873d70ae28560b5afb902455202c5a14c91db8e61a75fe44ba'}]}, 'timestamp': '2025-11-26 09:34:03.680656', '_unique_id': '626de4fc5f4b47818d830f7d7afcccfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.681 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6bb5685-514d-46cd-a9b0-3b61d23e04c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:34:03.682059', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0bed900e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10118.801021328, 'message_signature': 'e9aefa416688a772238e0ede0ff52dbca4d55770c2daa73380410964c39c3be4'}]}, 'timestamp': '2025-11-26 09:34:03.682350', '_unique_id': '7531670497fb4d6f9178c972c23dd1c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:34:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:34:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:34:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:34:03 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:03 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:34:03 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. 
Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.864 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.886 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.887 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache 
for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.888 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.888 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.889 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.889 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.889 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.890 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.891 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.891 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.908 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.908 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.909 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.909 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 
- - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:34:03 localhost nova_compute[229802]: 2025-11-26 09:34:03.910 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:34:04 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:34:04 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.104 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25080 DF PROTO=TCP SPT=44894 DPT=9101 SEQ=602906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C063FC0000000001030307) Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.369 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.449 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] 
skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.450 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.650 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.651 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12555MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.652 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.652 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.725 229806 DEBUG 
nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.726 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.726 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:34:04 localhost nova_compute[229802]: 2025-11-26 09:34:04.775 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:34:04 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:34:05 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:34:05 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:05 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:05 localhost nova_compute[229802]: 2025-11-26 09:34:05.238 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:34:05 localhost nova_compute[229802]: 2025-11-26 09:34:05.245 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:34:05 localhost nova_compute[229802]: 2025-11-26 09:34:05.263 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:34:05 localhost nova_compute[229802]: 2025-11-26 09:34:05.268 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated 
for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:34:05 localhost nova_compute[229802]: 2025-11-26 09:34:05.268 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:34:06 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:06 localhost nova_compute[229802]: 2025-11-26 09:34:06.123 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:06 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:06 localhost systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully. Nov 26 04:34:08 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:34:08 localhost systemd[1]: var-lib-containers-storage-overlay-6b40f2e3ce469630e9b0ac8513bc38db78e513919de8034db1894708fd48cdba-merged.mount: Deactivated successfully. Nov 26 04:34:08 localhost systemd[1]: var-lib-containers-storage-overlay-6b40f2e3ce469630e9b0ac8513bc38db78e513919de8034db1894708fd48cdba-merged.mount: Deactivated successfully. 
Nov 26 04:34:09 localhost nova_compute[229802]: 2025-11-26 09:34:09.142 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35390 DF PROTO=TCP SPT=40778 DPT=9100 SEQ=507539702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C077FC0000000001030307) Nov 26 04:34:09 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:09 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:09 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:10 localhost systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully. Nov 26 04:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:34:10 localhost systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully. Nov 26 04:34:10 localhost systemd[1]: tmp-crun.ZUl3CP.mount: Deactivated successfully. 
Nov 26 04:34:10 localhost podman[242434]: 2025-11-26 09:34:10.575080904 +0000 UTC m=+0.112112489 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 26 04:34:10 localhost podman[242434]: 2025-11-26 09:34:10.605044313 +0000 UTC m=+0.142075948 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 04:34:10 localhost podman[242434]: unhealthy Nov 26 04:34:11 localhost nova_compute[229802]: 2025-11-26 09:34:11.125 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:11 
localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:11 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:34:11 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:11 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:34:11 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Failed with result 'exit-code'. Nov 26 04:34:11 localhost podman[242452]: 2025-11-26 09:34:11.823977743 +0000 UTC m=+0.085327301 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, 
maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:34:11 localhost podman[242452]: 2025-11-26 09:34:11.833158634 +0000 UTC m=+0.094508192 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:34:11 localhost podman[242452]: unhealthy Nov 26 04:34:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25081 DF PROTO=TCP SPT=44894 DPT=9101 SEQ=602906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C083FC0000000001030307) Nov 26 04:34:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:34:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19239 DF PROTO=TCP SPT=34926 DPT=9882 SEQ=3409784122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C085FC0000000001030307) Nov 26 04:34:12 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:12 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:13 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:13 localhost systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully. Nov 26 04:34:13 localhost systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully. Nov 26 04:34:14 localhost systemd[1]: var-lib-containers-storage-overlay-7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc-merged.mount: Deactivated successfully. Nov 26 04:34:14 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:34:14 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. 
Nov 26 04:34:14 localhost nova_compute[229802]: 2025-11-26 09:34:14.172 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:14 localhost podman[242475]: 2025-11-26 09:34:14.185863238 +0000 UTC m=+1.447965970 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:34:14 localhost podman[242475]: 2025-11-26 09:34:14.223335757 +0000 UTC m=+1.485438509 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251118) Nov 26 04:34:14 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:14 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:15 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:34:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30534 DF PROTO=TCP SPT=42070 DPT=9105 SEQ=4033228926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C091700000000001030307) Nov 26 04:34:15 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:34:16 localhost systemd[1]: var-lib-containers-storage-overlay-cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344-merged.mount: Deactivated successfully. Nov 26 04:34:16 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:34:16 localhost nova_compute[229802]: 2025-11-26 09:34:16.127 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:16 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:16 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:34:16 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:34:17 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:34:17 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. 
Nov 26 04:34:17 localhost sshd[242501]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:34:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:34:18 localhost podman[242503]: 2025-11-26 09:34:18.83733325 +0000 UTC m=+0.090679821 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.) 
Nov 26 04:34:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30536 DF PROTO=TCP SPT=42070 DPT=9105 SEQ=4033228926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C09D7C0000000001030307) Nov 26 04:34:18 localhost podman[242503]: 2025-11-26 09:34:18.878266871 +0000 UTC m=+0.131613432 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 26 04:34:19 localhost nova_compute[229802]: 2025-11-26 09:34:19.176 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:19 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Nov 26 04:34:19 localhost systemd[1]: var-lib-containers-storage-overlay-9a4f9c83fc765d2523f15252985ad52f1a1e3e45616ae2a11946a3981b65bdf5-merged.mount: Deactivated successfully. Nov 26 04:34:19 localhost systemd[1]: var-lib-containers-storage-overlay-9a4f9c83fc765d2523f15252985ad52f1a1e3e45616ae2a11946a3981b65bdf5-merged.mount: Deactivated successfully. Nov 26 04:34:19 localhost systemd[1]: var-lib-containers-storage-overlay-94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7-merged.mount: Deactivated successfully. Nov 26 04:34:19 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:34:19 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:19 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:21 localhost nova_compute[229802]: 2025-11-26 09:34:21.171 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:34:22 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Nov 26 04:34:22 localhost podman[242523]: 2025-11-26 09:34:22.711155375 +0000 UTC m=+0.066430851 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:34:22 localhost podman[242523]: 2025-11-26 09:34:22.746619812 +0000 UTC m=+0.101895358 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:34:22 localhost podman[240049]: time="2025-11-26T09:34:22Z" level=error msg="Getting root fs size for \"73f5fd05db839fb6a1d1aa71f796fc97a73af0e0d291430d998c62ae8e85d8cb\": getting diffsize of layer \"3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34\" and its parent \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\": unmounting layer 3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34: replacing mount point \"/var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/merged\": device or resource busy" Nov 26 04:34:22 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:34:22 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:22 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:34:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30537 DF PROTO=TCP SPT=42070 DPT=9105 SEQ=4033228926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0AD3D0000000001030307) Nov 26 04:34:24 localhost nova_compute[229802]: 2025-11-26 09:34:24.208 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:24 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:34:24 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Nov 26 04:34:24 localhost podman[242546]: 2025-11-26 09:34:24.769975014 +0000 UTC m=+0.076332685 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 04:34:24 localhost podman[242546]: 2025-11-26 09:34:24.779354342 +0000 UTC 
m=+0.085712003 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:34:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=46017 DF PROTO=TCP SPT=46808 DPT=9102 SEQ=3200953752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0B4BD0000000001030307) Nov 26 04:34:25 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:25 localhost systemd[1]: var-lib-containers-storage-overlay-eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9-merged.mount: Deactivated successfully. Nov 26 04:34:25 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:25 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:25 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:25 localhost systemd[1]: var-lib-containers-storage-overlay-eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9-merged.mount: Deactivated successfully. Nov 26 04:34:25 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:34:26 localhost nova_compute[229802]: 2025-11-26 09:34:26.198 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:26 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:26 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:34:26 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Nov 26 04:34:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:34:26 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:27 localhost podman[242564]: 2025-11-26 09:34:27.055274256 +0000 UTC m=+0.062584749 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:34:27 localhost podman[242564]: 2025-11-26 09:34:27.066310197 +0000 UTC m=+0.073620700 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd) Nov 26 04:34:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:27 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5751 DF PROTO=TCP SPT=35236 DPT=9101 SEQ=4262791348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0C17C0000000001030307) Nov 26 04:34:28 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:28 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:34:28 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:29 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:29 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:34:29 localhost nova_compute[229802]: 2025-11-26 09:34:29.241 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:29 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:30 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Nov 26 04:34:30 localhost systemd[1]: var-lib-containers-storage-overlay-d6ea731fdd03e16191558f2f7302aba6c7ea628ed7b242aa43f6ff82950e24e1-merged.mount: Deactivated successfully. Nov 26 04:34:30 localhost systemd[1]: var-lib-containers-storage-overlay-d6ea731fdd03e16191558f2f7302aba6c7ea628ed7b242aa43f6ff82950e24e1-merged.mount: Deactivated successfully. Nov 26 04:34:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46019 DF PROTO=TCP SPT=46808 DPT=9102 SEQ=3200953752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0CC7D0000000001030307) Nov 26 04:34:31 localhost nova_compute[229802]: 2025-11-26 09:34:31.241 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:31 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:31 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. 
Nov 26 04:34:31 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:34:31 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:31 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:31 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:32 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:32 localhost systemd[1]: var-lib-containers-storage-overlay-9a4f9c83fc765d2523f15252985ad52f1a1e3e45616ae2a11946a3981b65bdf5-merged.mount: Deactivated successfully. Nov 26 04:34:32 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:32 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:33 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:34:33 localhost systemd[1]: var-lib-containers-storage-overlay-f99cd177b672ff33074ec35abbc6210e048ba1785e645693f779453f3bd61c4d-merged.mount: Deactivated successfully. 
Nov 26 04:34:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5753 DF PROTO=TCP SPT=35236 DPT=9101 SEQ=4262791348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0D93C0000000001030307) Nov 26 04:34:34 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:34 localhost nova_compute[229802]: 2025-11-26 09:34:34.279 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:34 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Nov 26 04:34:34 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Nov 26 04:34:34 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:34 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:34:35 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:35 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:35 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:35 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:36 localhost nova_compute[229802]: 2025-11-26 09:34:36.283 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:36 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Nov 26 04:34:36 localhost systemd[1]: var-lib-containers-storage-overlay-9d565698d62e4a7ac4d5579be96d621c994dd08165edad7b4fd7325073352493-merged.mount: Deactivated successfully. Nov 26 04:34:36 localhost systemd[1]: var-lib-containers-storage-overlay-9d565698d62e4a7ac4d5579be96d621c994dd08165edad7b4fd7325073352493-merged.mount: Deactivated successfully. Nov 26 04:34:37 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:38 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:38 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:34:38 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:38 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:34:38 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:34:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46020 DF PROTO=TCP SPT=46808 DPT=9102 SEQ=3200953752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0EBFC0000000001030307) Nov 26 04:34:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:39 localhost nova_compute[229802]: 2025-11-26 09:34:39.321 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:39 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:39 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:34:39 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:40 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:41 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:34:41 localhost sshd[242582]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:34:41 localhost nova_compute[229802]: 2025-11-26 09:34:41.307 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:41 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9-merged.mount: Deactivated successfully. 
Nov 26 04:34:42 localhost podman[242584]: 2025-11-26 09:34:42.467027859 +0000 UTC m=+0.094675033 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible) Nov 26 04:34:42 localhost podman[242584]: 2025-11-26 09:34:42.49923487 +0000 UTC m=+0.126881994 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 04:34:42 localhost podman[242584]: unhealthy Nov 26 04:34:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5754 DF PROTO=TCP SPT=35236 
DPT=9101 SEQ=4262791348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0F9FC0000000001030307) Nov 26 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:42 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-eadc15d9188ab59a3183de8359c9702c1c3bf67b60cc946527b932af6f7de9b9-merged.mount: Deactivated successfully. Nov 26 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:42 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:34:42 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Failed with result 'exit-code'. Nov 26 04:34:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55315 DF PROTO=TCP SPT=45368 DPT=9882 SEQ=3586410554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0FBFD0000000001030307) Nov 26 04:34:43 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:43 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 04:34:44 localhost podman[242638]: 2025-11-26 09:34:44.257798264 +0000 UTC m=+0.079934355 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:34:44 localhost podman[242638]: 2025-11-26 09:34:44.267833502 +0000 UTC m=+0.089969583 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:34:44 localhost podman[242638]: unhealthy Nov 26 04:34:44 localhost nova_compute[229802]: 2025-11-26 09:34:44.374 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:44 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Nov 26 04:34:44 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25659 DF PROTO=TCP SPT=38136 DPT=9105 SEQ=930192872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C106A00000000001030307) Nov 26 04:34:46 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:34:46 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:46 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:46 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:34:46 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. 
Nov 26 04:34:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:34:46 localhost nova_compute[229802]: 2025-11-26 09:34:46.346 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:47 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:47 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:47 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:48 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25661 DF PROTO=TCP SPT=38136 DPT=9105 SEQ=930192872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C112BD0000000001030307) Nov 26 04:34:48 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:49 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:34:49 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:49 localhost podman[242674]: 2025-11-26 09:34:49.255049027 +0000 UTC m=+3.068709711 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 26 04:34:49 localhost podman[242674]: 2025-11-26 09:34:49.327495244 +0000 UTC m=+3.141155908 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 04:34:49 localhost nova_compute[229802]: 2025-11-26 09:34:49.376 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:34:50 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:50 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:51 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:51 localhost nova_compute[229802]: 2025-11-26 09:34:51.387 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:51 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:34:51 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:34:51 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:51 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:34:51 localhost podman[242723]: 2025-11-26 09:34:51.517902949 +0000 UTC m=+1.571471853 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 26 04:34:51 localhost podman[242723]: 2025-11-26 09:34:51.533636978 +0000 UTC m=+1.587205892 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': 
'/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 26 04:34:51 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:34:52 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:52 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:52 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:34:52 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:34:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25662 DF PROTO=TCP SPT=38136 DPT=9105 SEQ=930192872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1227C0000000001030307) Nov 26 04:34:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
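The kernel `DROPPING:` entries recurring through this window record SYN packets from 192.168.122.10 to the exporter ports (9105, and later 9102 and 9100) being dropped on `br-ex`. A minimal sketch for pulling source, destination, and destination port out of these entries — assuming the journal is available as plain text and the field layout matches the netfilter log format shown above:

```python
import re

# Matches the netfilter "DROPPING:" entries seen in the journal above.
# Lazy gaps (.*?) skip over MACSRC=/MACDST=/MACPROTO= so only the IP-level
# SRC=/DST=/PROTO= fields are captured.
DROP_RE = re.compile(
    r"DROPPING: IN=(?P<iface>\S+) .*?"
    r"SRC=(?P<src>\S+) DST=(?P<dst>\S+) .*?"
    r"PROTO=(?P<proto>\S+) SPT=(?P<spt>\d+) DPT=(?P<dpt>\d+)"
)

def parse_drops(lines):
    """Yield (src, dst, dpt) for every dropped-packet entry in the log."""
    for line in lines:
        m = DROP_RE.search(line)
        if m:
            yield m.group("src"), m.group("dst"), int(m.group("dpt"))

# Trimmed sample taken from one of the entries above.
sample = ("Nov 26 04:34:52 localhost kernel: DROPPING: IN=br-ex OUT= "
          "MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 "
          "SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 "
          "TTL=62 ID=25662 DF PROTO=TCP SPT=38136 DPT=9105 SEQ=930192872")
print(list(parse_drops([sample])))  # [('192.168.122.10', '192.168.122.107', 9105)]
```

Aggregating the yielded tuples by destination port would confirm whether the drops are confined to the Prometheus exporter scrape ports (9100/9102/9105), as the entries in this window suggest.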
Nov 26 04:34:53 localhost podman[242756]: 2025-11-26 09:34:53.043832758 +0000 UTC m=+0.058023401 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:34:53 localhost podman[242756]: 2025-11-26 09:34:53.079602302 +0000 UTC m=+0.093792985 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:34:53 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:53 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:53 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Nov 26 04:34:54 localhost systemd[1]: var-lib-containers-storage-overlay-d6ea731fdd03e16191558f2f7302aba6c7ea628ed7b242aa43f6ff82950e24e1-merged.mount: Deactivated successfully. Nov 26 04:34:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:34:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:54 localhost systemd[1]: var-lib-containers-storage-overlay-d6ea731fdd03e16191558f2f7302aba6c7ea628ed7b242aa43f6ff82950e24e1-merged.mount: Deactivated successfully. Nov 26 04:34:54 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:34:54 localhost nova_compute[229802]: 2025-11-26 09:34:54.379 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28851 DF PROTO=TCP SPT=52812 DPT=9102 SEQ=3700384689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C129FC0000000001030307) Nov 26 04:34:54 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:34:55 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:55 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:34:55 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:34:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:34:55 localhost podman[242778]: 2025-11-26 09:34:55.907229769 +0000 UTC m=+0.052178715 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 26 04:34:55 localhost podman[242778]: 2025-11-26 09:34:55.912255119 +0000 UTC 
m=+0.057204065 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 04:34:55 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:34:55 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:56 localhost nova_compute[229802]: 2025-11-26 09:34:56.450 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:57 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:34:57 localhost systemd[1]: var-lib-containers-storage-overlay-3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d-merged.mount: Deactivated successfully. Nov 26 04:34:57 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:57 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:34:57 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:34:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35392 DF PROTO=TCP SPT=40778 DPT=9100 SEQ=507539702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C135FC0000000001030307) Nov 26 04:34:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:34:58 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:34:58 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Nov 26 04:34:58 localhost podman[242798]: 2025-11-26 09:34:58.343046284 +0000 UTC m=+0.099031440 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.build-date=20251118) Nov 26 04:34:58 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Nov 26 04:34:58 localhost podman[242798]: 2025-11-26 09:34:58.355547761 +0000 UTC m=+0.111532947 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, 
org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.429 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.430 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.676 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.676 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.676 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:34:58 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.862 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.862 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.862 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:34:58 localhost nova_compute[229802]: 2025-11-26 09:34:58.863 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:34:59 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Nov 26 04:34:59 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:34:59 localhost systemd[1]: var-lib-containers-storage-overlay-f99cd177b672ff33074ec35abbc6210e048ba1785e645693f779453f3bd61c4d-merged.mount: Deactivated successfully. 
Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.373 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.384 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:34:59 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. 
Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.442 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.442 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.443 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.443 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.444 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.444 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:59 
localhost nova_compute[229802]: 2025-11-26 09:34:59.444 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.444 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:34:59 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:34:59 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.502 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.503 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.503 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.504 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.504 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:34:59 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:34:59 localhost nova_compute[229802]: 2025-11-26 09:34:59.963 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.095 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.095 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.227 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.228 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12461MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.228 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.228 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:35:00 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.316 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.317 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.317 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.354 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:35:00 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:00 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:00 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.816 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.823 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.862 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.864 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:35:00 localhost nova_compute[229802]: 2025-11-26 09:35:00.864 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:35:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28853 DF PROTO=TCP SPT=52812 DPT=9102 SEQ=3700384689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C141BC0000000001030307) Nov 26 04:35:01 localhost nova_compute[229802]: 2025-11-26 09:35:01.029 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:01 localhost nova_compute[229802]: 2025-11-26 09:35:01.030 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:01 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Nov 26 04:35:01 localhost systemd[1]: var-lib-containers-storage-overlay-64dc032106e0155914798cde9f31ffb5c79bf4498cfa055c6994544bc631b545-merged.mount: Deactivated successfully. Nov 26 04:35:01 localhost systemd[1]: var-lib-containers-storage-overlay-64dc032106e0155914798cde9f31ffb5c79bf4498cfa055c6994544bc631b545-merged.mount: Deactivated successfully. Nov 26 04:35:01 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. 
Nov 26 04:35:01 localhost nova_compute[229802]: 2025-11-26 09:35:01.499 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:02 localhost systemd[1]: var-lib-containers-storage-overlay-4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91-merged.mount: Deactivated successfully. Nov 26 04:35:02 localhost systemd[1]: var-lib-containers-storage-overlay-9d565698d62e4a7ac4d5579be96d621c994dd08165edad7b4fd7325073352493-merged.mount: Deactivated successfully. Nov 26 04:35:02 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:35:02 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:35:03 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. Nov 26 04:35:03 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. 
Nov 26 04:35:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:35:03.635 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:35:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:35:03.635 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:35:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:35:03.636 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:35:03 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:35:03 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. 
Nov 26 04:35:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53172 DF PROTO=TCP SPT=35954 DPT=9101 SEQ=360686782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C14E7C0000000001030307) Nov 26 04:35:04 localhost nova_compute[229802]: 2025-11-26 09:35:04.405 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:05 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:35:05 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:05 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:35:05 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:35:06 localhost nova_compute[229802]: 2025-11-26 09:35:06.504 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:06 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:35:06 localhost systemd[1]: session-56.scope: Deactivated successfully. Nov 26 04:35:06 localhost systemd[1]: session-56.scope: Consumed 58.720s CPU time. Nov 26 04:35:06 localhost systemd-logind[761]: Session 56 logged out. Waiting for processes to exit. Nov 26 04:35:06 localhost systemd-logind[761]: Removed session 56. 
Nov 26 04:35:07 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:07 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:07 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:35:07 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:07 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:35:07 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:08 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:08 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:08 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:08 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:08 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:35:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28854 DF PROTO=TCP SPT=52812 DPT=9102 SEQ=3700384689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C161FC0000000001030307) Nov 26 04:35:09 localhost nova_compute[229802]: 2025-11-26 09:35:09.442 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:09 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:09 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:09 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:10 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:35:11 localhost nova_compute[229802]: 2025-11-26 09:35:11.559 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53173 DF PROTO=TCP SPT=35954 DPT=9101 SEQ=360686782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C16DFD0000000001030307) Nov 26 04:35:12 localhost systemd[1]: var-lib-containers-storage-overlay-936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552-merged.mount: Deactivated successfully. 
Nov 26 04:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:35:13 localhost podman[242861]: 2025-11-26 09:35:13.053544059 +0000 UTC m=+0.070238878 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251118) Nov 26 04:35:13 localhost podman[242861]: 2025-11-26 09:35:13.08353363 +0000 UTC m=+0.100228479 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 26 04:35:13 localhost podman[242861]: unhealthy Nov 26 04:35:13 localhost systemd[1]: 
var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:35:13 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:35:14 localhost nova_compute[229802]: 2025-11-26 09:35:14.445 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:14 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:35:14 localhost sshd[242879]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:35:14 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:35:14 localhost sshd[242880]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:35:14 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:35:14 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:35:14 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Failed with result 'exit-code'. 
Nov 26 04:35:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20751 DF PROTO=TCP SPT=53080 DPT=9105 SEQ=3097529693 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C17BD10000000001030307) Nov 26 04:35:16 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:35:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:35:16 localhost podman[242881]: 2025-11-26 09:35:16.519154494 +0000 UTC m=+0.098327649 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:35:16 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. 
Nov 26 04:35:16 localhost podman[242881]: 2025-11-26 09:35:16.531770993 +0000 UTC m=+0.110944158 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:35:16 localhost podman[242881]: unhealthy Nov 26 04:35:16 localhost nova_compute[229802]: 2025-11-26 09:35:16.563 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20752 DF PROTO=TCP SPT=53080 DPT=9105 SEQ=3097529693 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C17FBC0000000001030307) Nov 26 04:35:17 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Nov 26 04:35:17 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:17 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:17 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:17 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:35:17 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. Nov 26 04:35:17 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:18 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20753 DF PROTO=TCP SPT=53080 DPT=9105 SEQ=3097529693 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C187BC0000000001030307) Nov 26 04:35:18 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:18 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:35:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:19 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:19 localhost nova_compute[229802]: 2025-11-26 09:35:19.481 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:19 localhost sshd[242904]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:35:19 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:19 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:20 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:20 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:20 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. 
Nov 26 04:35:21 localhost systemd[1]: var-lib-containers-storage-overlay-17c292e04cc2973af4faecaf51a38b12d0c20f47d0b5fc279a11e99087cbc694-merged.mount: Deactivated successfully. Nov 26 04:35:21 localhost nova_compute[229802]: 2025-11-26 09:35:21.567 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:35:21 localhost podman[242906]: 2025-11-26 09:35:21.842403073 +0000 UTC m=+0.100184058 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:35:21 localhost podman[242906]: 2025-11-26 09:35:21.88680164 +0000 UTC m=+0.144582595 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:35:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:35:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20754 DF PROTO=TCP SPT=53080 DPT=9105 SEQ=3097529693 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1977D0000000001030307) Nov 26 04:35:23 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:35:23 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:35:23 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:35:23 localhost systemd[1]: var-lib-containers-storage-overlay-3a6b696492174e75acfc2b8fd9aa6357f72d30dd36dce3e87a766ba6c92f819d-merged.mount: Deactivated successfully. Nov 26 04:35:23 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:35:23 localhost podman[242931]: 2025-11-26 09:35:23.496907317 +0000 UTC m=+0.761608227 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, version=9.6, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350) Nov 26 
04:35:23 localhost podman[242931]: 2025-11-26 09:35:23.538450564 +0000 UTC m=+0.803151514 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm) Nov 26 04:35:24 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:35:24 localhost nova_compute[229802]: 2025-11-26 09:35:24.483 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:24 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Nov 26 04:35:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:35:24 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Nov 26 04:35:24 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. 
Nov 26 04:35:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25183 DF PROTO=TCP SPT=60452 DPT=9102 SEQ=1616036468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C19F3C0000000001030307) Nov 26 04:35:25 localhost sshd[242964]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:35:25 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:25 localhost systemd-logind[761]: New session 57 of user zuul. Nov 26 04:35:25 localhost systemd[1]: Started Session 57 of User zuul. Nov 26 04:35:25 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:35:25 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:35:25 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:35:25 localhost podman[242953]: 2025-11-26 09:35:25.692701533 +0000 UTC m=+1.142262555 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:35:25 localhost podman[242953]: 2025-11-26 09:35:25.727174366 +0000 UTC m=+1.176735398 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:35:26 localhost python3.9[243070]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman Nov 26 04:35:26 localhost nova_compute[229802]: 2025-11-26 09:35:26.573 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:26 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:26 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:26 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. 
Nov 26 04:35:26 localhost podman[240049]: time="2025-11-26T09:35:26Z" level=error msg="Getting root fs size for \"9794eab88e1b8a77d12697aed6eb31954ed19b9b36ebd75c105c94c7886b54a7\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy" Nov 26 04:35:26 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:35:27 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:35:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29885 DF PROTO=TCP SPT=53142 DPT=9101 SEQ=2853431565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1ABBC0000000001030307) Nov 26 04:35:29 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. 
Nov 26 04:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:35:29 localhost systemd[1]: var-lib-containers-storage-overlay-1ad32a5db29098f5568060ccdb89afe68c9fb2dd318793af5aa95785da54e96e-merged.mount: Deactivated successfully. Nov 26 04:35:29 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:29 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:29 localhost podman[243085]: 2025-11-26 09:35:29.448436076 +0000 UTC m=+1.827070027 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Nov 26 04:35:29 localhost podman[243085]: 2025-11-26 09:35:29.453101184 +0000 UTC m=+1.831735115 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0) Nov 26 04:35:29 localhost nova_compute[229802]: 2025-11-26 09:35:29.490 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:29 localhost podman[243096]: 2025-11-26 09:35:29.501694824 +0000 UTC m=+0.293281329 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:35:29 localhost podman[243096]: 2025-11-26 09:35:29.514763989 +0000 UTC m=+0.306350514 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 04:35:30 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:30 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Nov 26 04:35:30 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Nov 26 04:35:30 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Nov 26 04:35:30 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:30 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:30 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:35:30 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:35:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25185 DF PROTO=TCP SPT=60452 DPT=9102 SEQ=1616036468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1B6FC0000000001030307) Nov 26 04:35:31 localhost python3.9[243231]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:35:31 localhost systemd[1]: Started libpod-conmon-123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.scope. Nov 26 04:35:31 localhost podman[243232]: 2025-11-26 09:35:31.306201894 +0000 UTC m=+0.109958797 container exec 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 26 04:35:31 localhost podman[243232]: 2025-11-26 09:35:31.341568946 +0000 UTC m=+0.145325859 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 26 04:35:31 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 26 04:35:31 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:35:31 localhost systemd[1]: var-lib-containers-storage-overlay-64dc032106e0155914798cde9f31ffb5c79bf4498cfa055c6994544bc631b545-merged.mount: Deactivated successfully.
Nov 26 04:35:31 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:31 localhost nova_compute[229802]: 2025-11-26 09:35:31.577 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:35:32 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Nov 26 04:35:32 localhost systemd[1]: var-lib-containers-storage-overlay-4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91-merged.mount: Deactivated successfully.
Nov 26 04:35:32 localhost python3.9[243373]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 26 04:35:32 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:35:32 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:35:32 localhost systemd[1]: libpod-conmon-123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.scope: Deactivated successfully.
Nov 26 04:35:32 localhost systemd[1]: Started libpod-conmon-123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.scope.
Nov 26 04:35:32 localhost podman[243374]: 2025-11-26 09:35:32.537032486 +0000 UTC m=+0.210738802 container exec 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 26 04:35:32 localhost podman[243374]: 2025-11-26 09:35:32.540132585 +0000 UTC m=+0.213838891 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image':
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 26 04:35:33 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 26 04:35:33 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 26 04:35:33 localhost python3.9[243514]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:35:33 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Nov 26 04:35:34 localhost systemd[1]: libpod-conmon-123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.scope: Deactivated successfully.
Nov 26 04:35:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29887 DF PROTO=TCP SPT=53142 DPT=9101 SEQ=2853431565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1C37C0000000001030307)
Nov 26 04:35:34 localhost systemd[1]: var-lib-containers-storage-overlay-435018edd8bd40c695e2155529ca14d60cdcfce0e73e35f8aa64d52f759a88b7-merged.mount: Deactivated successfully.
Nov 26 04:35:34 localhost nova_compute[229802]: 2025-11-26 09:35:34.494 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:35:34 localhost python3.9[243624]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 26 04:35:34 localhost systemd[1]: var-lib-containers-storage-overlay-4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91-merged.mount: Deactivated successfully.
Nov 26 04:35:34 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 26 04:35:35 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 26 04:35:36 localhost nova_compute[229802]: 2025-11-26 09:35:36.585 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:35:37 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:35:37 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:35:37 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:35:38 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 26 04:35:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:35:39 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:35:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33996 DF PROTO=TCP SPT=35184 DPT=9100 SEQ=3580594546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1D7FC0000000001030307)
Nov 26 04:35:39 localhost nova_compute[229802]: 2025-11-26 09:35:39.497 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:35:39 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:35:40 localhost python3.9[243744]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 26 04:35:40 localhost systemd[1]: Started libpod-conmon-659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.scope.
Nov 26 04:35:40 localhost podman[243745]: 2025-11-26 09:35:40.680746628 +0000 UTC m=+0.131506964 container exec 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 26 04:35:40 localhost podman[243745]: 2025-11-26 09:35:40.716410044 +0000 UTC m=+0.167170340 container
exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 26 04:35:40 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:35:40 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:35:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:41 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:35:41 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 26 04:35:41 localhost nova_compute[229802]: 2025-11-26 09:35:41.591 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:35:42 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:35:42 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:35:42 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:35:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29888 DF PROTO=TCP SPT=53142 DPT=9101 SEQ=2853431565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1E3FC0000000001030307)
Nov 26 04:35:42 localhost python3.9[243881]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 26 04:35:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49500 DF PROTO=TCP SPT=48532 DPT=9882 SEQ=208036163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1E5FC0000000001030307)
Nov 26 04:35:43 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:35:43 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:35:44 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:35:44 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:44 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:44 localhost systemd[1]: libpod-conmon-659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.scope: Deactivated successfully.
Nov 26 04:35:44 localhost systemd[1]: Started libpod-conmon-659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.scope.
Nov 26 04:35:44 localhost podman[243882]: 2025-11-26 09:35:44.116209741 +0000 UTC m=+1.486842920 container exec 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent,
org.label-schema.vendor=CentOS)
Nov 26 04:35:44 localhost podman[243882]: 2025-11-26 09:35:44.151730154 +0000 UTC m=+1.522363333 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 26 04:35:44 localhost nova_compute[229802]: 2025-11-26 09:35:44.498 229806
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:35:44 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:35:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:35:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=630 DF PROTO=TCP SPT=46380 DPT=9105 SEQ=2014213063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1F1000000000001030307)
Nov 26 04:35:45 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:35:45 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:35:46 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:35:46 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:46 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:46 localhost podman[243908]: 2025-11-26 09:35:46.234351429 +0000 UTC m=+0.486610800 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 26 04:35:46 localhost podman[243908]: 2025-11-26 09:35:46.264048218 +0000 UTC m=+0.516307569 container exec_died
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:35:46 localhost podman[243908]: unhealthy
Nov 26 04:35:46 localhost nova_compute[229802]: 2025-11-26 09:35:46.616 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:35:46
localhost python3.9[244037]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:35:47 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:35:47 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:35:47 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:35:47 localhost python3.9[244147]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 26 04:35:48 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:35:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 04:35:48 localhost systemd[1]: var-lib-containers-storage-overlay-936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552-merged.mount: Deactivated successfully.
Nov 26 04:35:48 localhost systemd[1]: var-lib-containers-storage-overlay-936e3cf49366e6a39a6d9fcfb7eda40c941ef016ddebdad776e7ba69c7632552-merged.mount: Deactivated successfully.
Nov 26 04:35:48 localhost systemd[1]: libpod-conmon-659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.scope: Deactivated successfully.
Nov 26 04:35:48 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:48 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:35:48 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Main process exited, code=exited, status=1/FAILURE
Nov 26 04:35:48 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Failed with result 'exit-code'.
Nov 26 04:35:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=632 DF PROTO=TCP SPT=46380 DPT=9105 SEQ=2014213063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1FCFC0000000001030307)
Nov 26 04:35:48 localhost podman[244158]: 2025-11-26 09:35:48.848269865 +0000 UTC m=+0.302796814 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z',
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:35:48 localhost podman[244158]: 2025-11-26 09:35:48.877445126 +0000 UTC m=+0.331972115 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:35:48 localhost podman[244158]: unhealthy Nov 26 04:35:49 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:49 localhost nova_compute[229802]: 2025-11-26 09:35:49.536 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:49 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:50 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. 
Nov 26 04:35:50 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:35:50 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:50 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:50 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:35:50 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. Nov 26 04:35:50 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:35:50 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:50 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:51 localhost nova_compute[229802]: 2025-11-26 09:35:51.624 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:51 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:51 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. 
Nov 26 04:35:52 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Nov 26 04:35:52 localhost python3.9[244339]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:35:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=633 DF PROTO=TCP SPT=46380 DPT=9105 SEQ=2014213063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C20CBC0000000001030307) Nov 26 04:35:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:35:54 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:35:54 localhost systemd[1]: var-lib-containers-storage-overlay-06407ef92aa62a84cce5a4c105c3570211ecf8dcded9a8a2a2909fc5cebcb893-merged.mount: Deactivated successfully. 
Nov 26 04:35:54 localhost nova_compute[229802]: 2025-11-26 09:35:54.569 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:54 localhost nova_compute[229802]: 2025-11-26 09:35:54.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:54 localhost nova_compute[229802]: 2025-11-26 09:35:54.610 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 26 04:35:54 localhost nova_compute[229802]: 2025-11-26 09:35:54.628 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 26 04:35:54 localhost nova_compute[229802]: 2025-11-26 09:35:54.630 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:54 localhost nova_compute[229802]: 2025-11-26 09:35:54.630 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 26 04:35:54 localhost nova_compute[229802]: 2025-11-26 09:35:54.651 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task 
ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:54 localhost systemd[1]: Started libpod-conmon-8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.scope. Nov 26 04:35:54 localhost podman[244340]: 2025-11-26 09:35:54.745779088 +0000 UTC m=+1.972447427 container exec 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:35:54 localhost podman[244351]: 2025-11-26 09:35:54.750819817 +0000 UTC m=+1.008719809 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller) Nov 26 04:35:54 localhost 
podman[244340]: 2025-11-26 09:35:54.776511939 +0000 UTC m=+2.003180238 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:35:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38846 DF 
PROTO=TCP SPT=48138 DPT=9102 SEQ=3911535079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2143D0000000001030307) Nov 26 04:35:54 localhost podman[244351]: 2025-11-26 09:35:54.818567077 +0000 UTC m=+1.076467049 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 04:35:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:35:55 localhost nova_compute[229802]: 2025-11-26 09:35:55.664 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:55 localhost nova_compute[229802]: 2025-11-26 09:35:55.665 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:35:55 localhost nova_compute[229802]: 2025-11-26 09:35:55.665 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:35:55 localhost nova_compute[229802]: 2025-11-26 09:35:55.925 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:35:55 localhost nova_compute[229802]: 2025-11-26 09:35:55.926 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:35:55 localhost nova_compute[229802]: 2025-11-26 09:35:55.926 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:35:55 localhost nova_compute[229802]: 2025-11-26 09:35:55.926 
229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:35:55 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:56 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:56 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.404 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.424 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.424 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.425 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.441 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.442 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:35:56 localhost 
nova_compute[229802]: 2025-11-26 09:35:56.442 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.442 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.443 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.655 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.922 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.993 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m 
Nov 26 04:35:56 localhost nova_compute[229802]: 2025-11-26 09:35:56.994 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:35:57 localhost nova_compute[229802]: 2025-11-26 09:35:57.202 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:35:57 localhost nova_compute[229802]: 2025-11-26 09:35:57.204 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12245MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:35:57 localhost nova_compute[229802]: 2025-11-26 09:35:57.205 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:35:57 localhost nova_compute[229802]: 2025-11-26 09:35:57.205 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:35:57 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Nov 26 04:35:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:35:57 localhost sshd[244463]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:35:57 localhost nova_compute[229802]: 2025-11-26 09:35:57.406 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:35:57 localhost nova_compute[229802]: 2025-11-26 09:35:57.407 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:35:57 localhost nova_compute[229802]: 2025-11-26 09:35:57.407 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:35:57 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Nov 26 04:35:57 localhost nova_compute[229802]: 2025-11-26 09:35:57.509 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:35:57 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:35:57 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:57 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:35:57 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:35:57 localhost podman[244430]: 2025-11-26 09:35:57.78161954 +0000 UTC m=+2.110614751 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, container_name=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9) Nov 26 04:35:57 localhost podman[244430]: 2025-11-26 09:35:57.796575072 +0000 UTC m=+2.125570273 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-type=git, io.buildah.version=1.33.7, version=9.6) Nov 26 04:35:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28856 DF PROTO=TCP SPT=52812 DPT=9102 SEQ=3700384689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C21FFC0000000001030307) Nov 26 04:35:57 localhost podman[244464]: 2025-11-26 09:35:57.861652708 +0000 UTC m=+0.496449201 container 
health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:35:57 localhost podman[244464]: 2025-11-26 09:35:57.869574518 +0000 UTC m=+0.504371031 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': 
True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.053 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.060 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.075 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 
'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.076 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.077 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.260 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.260 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.261 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:35:58 localhost python3.9[244628]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:35:58 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:35:58 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:58 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.605 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:58 localhost nova_compute[229802]: 2025-11-26 09:35:58.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:59 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:35:59 localhost nova_compute[229802]: 2025-11-26 09:35:59.569 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:35:59 localhost nova_compute[229802]: 2025-11-26 09:35:59.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:35:59 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:36:00 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:36:00 localhost systemd[1]: libpod-conmon-8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.scope: Deactivated successfully. Nov 26 04:36:00 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:36:00 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:00 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:36:00 localhost systemd[1]: Started libpod-conmon-8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.scope. 
Nov 26 04:36:00 localhost podman[244629]: 2025-11-26 09:36:00.123740712 +0000 UTC m=+1.689324705 container exec 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 04:36:00 localhost podman[244629]: 2025-11-26 09:36:00.153374409 +0000 UTC m=+1.718958322 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 04:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:36:00 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53175 DF PROTO=TCP SPT=35954 DPT=9101 SEQ=360686782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C22BFD0000000001030307) Nov 26 04:36:01 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:01 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:36:01 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:01 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:36:01 localhost podman[244659]: 2025-11-26 09:36:01.344844549 +0000 UTC m=+0.854038075 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:36:01 localhost podman[244658]: 2025-11-26 09:36:01.404685959 +0000 UTC m=+0.916819987 container health_status 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 04:36:01 localhost podman[244658]: 2025-11-26 09:36:01.440405027 +0000 UTC m=+0.952539025 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:36:01 localhost podman[244659]: 2025-11-26 09:36:01.464211408 +0000 UTC m=+0.973404934 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, 
io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:36:01 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:36:01 localhost nova_compute[229802]: 2025-11-26 09:36:01.610 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:36:01 localhost nova_compute[229802]: 2025-11-26 09:36:01.688 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:02 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:02 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:02 localhost python3.9[244802]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:36:02 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:36:02 localhost nova_compute[229802]: 2025-11-26 09:36:02.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:36:02 localhost python3.9[244912]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman Nov 26 04:36:03 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Nov 26 04:36:03 localhost systemd[1]: var-lib-containers-storage-overlay-17c292e04cc2973af4faecaf51a38b12d0c20f47d0b5fc279a11e99087cbc694-merged.mount: Deactivated successfully. Nov 26 04:36:03 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:36:03 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:03 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:03 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:36:03 localhost systemd[1]: libpod-conmon-8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.scope: Deactivated successfully. 
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.560 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.561 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.566 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '41556bfd-614b-4dd9-8533-6a492e482bb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.561410', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5362882c-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': 'bca5ca09ff1e17f03e909babf9fad52cbc37b1b585fa872443cd12795e8fff70'}]}, 'timestamp': '2025-11-26 09:36:03.567326', '_unique_id': '6db3b05b349848bbb8b6c61cc7d872a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.568 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.570 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a269736-2abe-4d35-82e9-5c2fe3b69d37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.570462', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '53631918-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': '8177c610bda34fa1f3a400196d2b8a8190db6a5ead64d9ad362848f212993822'}]}, 'timestamp': '2025-11-26 09:36:03.570984', '_unique_id': 'f1bb6bff5e354d5e95900180a9f39751'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.571 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.573 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.599 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 627516836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.600 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 21052656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '23702fef-2870-419d-a57e-7bad0c6bc609', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 627516836, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.573169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53679650-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': '19da3ceb17e9585bcb40c3ff201c147ffc79ef9f9c458ae0db12c78693790f4a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21052656, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:36:03.573169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5367aabe-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': '9715ecda9efd9492abec6bf5d8168cac10b144e9c0c93de68a2b15e2fc6c87c5'}]}, 'timestamp': '2025-11-26 09:36:03.600882', '_unique_id': '9f45706ba9214980a6408f8a340c434f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.602 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.603 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.614 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.614 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '326a2ec2-a106-4632-9242-71efb6673d70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.603845', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5369ca6a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.846167129, 'message_signature': 'af30b1acc9ae4c35c6fe68b1d92607311f0df14885c4e3c2b78754e7288a39de'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': 
'2025-11-26T09:36:03.603845', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5369dd66-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.846167129, 'message_signature': 'e11853efc32e18536705449cbb35e4789dbf325208942e9339af75f8eb92684c'}]}, 'timestamp': '2025-11-26 09:36:03.615274', '_unique_id': 'ccf2a05553694b2d82c948a4074c3252'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.616 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.617 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.617 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 9035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8c2ba606-a891-40b2-9f6c-f047ffa94aca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9035, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.617767', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '536a5322-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': '20624b68ffd14035973966e04d8d06050072de9e4c07ec6c9f44b537c2ebfc0b'}]}, 'timestamp': '2025-11-26 09:36:03.618315', '_unique_id': '934908ed6c2947198b595e9fd9deb7ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.620 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.620 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94aeaff8-ecee-4b71-9ac4-98689e4c277b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.620457', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '536ab952-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.846167129, 'message_signature': 'd4c47ee000cf434a9fc1703a9658454068fd91eaa12d5baa413e80d42d7aa723'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:36:03.620457', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '536acae6-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.846167129, 'message_signature': '3bc6d9b9e6d61c539b56b2df91473a4e5bbb7a5eecded9df40ad0e0c671f3ec9'}]}, 'timestamp': '2025-11-26 09:36:03.621347', '_unique_id': '46a0e39eedbc488ba26e178b199fc150'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.623 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.623 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7fa31fa-b028-4cdd-a308-da6d1372aafa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.623554', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '536b326a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': '01686aa9bab1a4c9cf05d24e764ab3399491b718396531471981f23bddd3bfcb'}]}, 'timestamp': '2025-11-26 09:36:03.624057', '_unique_id': '31c7967de1c146e49f5537495b233431'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.624 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.626 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.626 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1691d9bb-6a53-4f1d-8112-badf0e05446a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.626193', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '536b99a8-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': '1217133ab9f671aab80960b4709c0f2d176b4cb8ea6c08dcd93569c062120506'}]}, 'timestamp': '2025-11-26 09:36:03.626670', '_unique_id': 'f79c25988a5b4ebd8b4dff8510c43240'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.627 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.628 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 26 04:36:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:36:03.636 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 26 04:36:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:36:03.637 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 26 04:36:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:36:03.638 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.645 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 52.296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '453828e5-ce92-4f2e-a115-cff0d47fea04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:36:03.628776', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '536e8c12-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.887540626, 'message_signature': '96583f20802206fe6b0c9b0305d700b423a763f66c9b1d7a2490e53b948f72e1'}]}, 'timestamp': '2025-11-26 09:36:03.646057', '_unique_id': '4305ce4e244c41bdbbd1b260b1ce5576'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.647 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.648 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.648 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f27bb8cb-bda0-4806-81bf-62298a2a80a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.648431', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '536efe54-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': '10df898bb108dfe67ae0950de24ced5da568ca51827be56593d5e2e361077a2f'}]}, 'timestamp': '2025-11-26 09:36:03.648912', '_unique_id': '084d1de602734b979487aff182eb56e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.649 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.650 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.651 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.651 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2e499f8-75cf-47d5-9168-9ba0c9b829f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.651243', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '536f6be6-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': 'ba9f9cd0d7f10c12cc8632ac4fac9c96ab6749e2bbc70e2fbd54c2f59659fd93'}]}, 'timestamp': '2025-11-26 09:36:03.651713', '_unique_id': 'c95033354ea847bab2be27d3b7f83762'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.653 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.653 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1141678425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.654 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 173265014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b60577bf-4223-4048-abc2-df64867a8b5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1141678425, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.653811', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '536fd1b2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': '35a497883e0b67e6990dc6fdcbd3e5393602d74e0e683e4ef64f1e164d0d2e96'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 173265014, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:36:03.653811', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '536fe1fc-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': 'a40a91c9b694eaff74d12b0d641813f3d2a9b6d64cf5bfe3752d34ffcbd9c431'}]}, 'timestamp': '2025-11-26 09:36:03.654704', '_unique_id': '49358e0aa8e64c8e89aa7cb3c48c8682'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:36:03.655 12 ERROR oslo_messaging.notify.messaging [same kombu.exceptions.OperationalError traceback as at 09:36:03.652 above] Nov 26 04:36:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.656 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.657 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 57430000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.658 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51da8c77-0020-49a1-90dc-3390e299c644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57430000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:36:03.657025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 
'cpu_number': 1}, 'message_id': '53704dcc-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.887540626, 'message_signature': '992e4c1ee2088a5f418c2209177fa3d8aca539792729bd9a6f0adcf5595b9c82'}]}, 'timestamp': '2025-11-26 09:36:03.657480', '_unique_id': '46b066e7a76e47e49b799f05a07da0ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.658 12 ERROR oslo_messaging.notify.messaging [same kombu.exceptions.OperationalError traceback as at 09:36:03.652 above]
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.658 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.659 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.659 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '0480a06c-1648-49ab-bd48-9eab3f51c432', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.659582', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5370b10e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': 'ec465fe142522164274bdfff1dae0a1887730fda03fff582bb9312335692f42d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:36:03.659582', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5370c2a2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': '4c0def98573f8f760572940a0217d89c4586f1527a02526d7f15c13eba461e1c'}]}, 'timestamp': '2025-11-26 09:36:03.660457', '_unique_id': 'c146346b531347d9918ea9b41df82ec5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.662 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.662 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.662 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10ddc7e2-7394-449f-af91-96673a701097', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.662750', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '53712e7c-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': '1f8ed4309df85316be71574e0a49879d5b8a52b1875b16d2a4156bebdefebb98'}]}, 'timestamp': '2025-11-26 09:36:03.663247', '_unique_id': 'c27e9135a11544eea15830405b77a696'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 
04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.665 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.665 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91cc1459-2812-445c-836e-570ae957287e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.665372', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '537193c6-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': 'dc9327ae54fdcfdc4c96ac0a1133be0f3893bc9afb1487208473bc5ff30e449d'}]}, 'timestamp': '2025-11-26 09:36:03.665840', '_unique_id': 'f0bbab659d464b7e817de83e05f9bef0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.667 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.668 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to 
notifications. Payload={'message_id': '96fc719d-45fd-4753-a4c7-1323d4e96181', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.668224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '537202de-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': '8fcf769b077d032bda595f069d182171d64f97099aac1b21254e91e88cdc773f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:36:03.668224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5372130a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': '11e7738cb7db45def7975ccd960ec64b56a65572db3ccfc9bf1427c92800df05'}]}, 'timestamp': '2025-11-26 09:36:03.669095', '_unique_id': 'd47840d8df5d4db2ae74b071d0d920ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.671 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '247959d7-bbd7-4a00-bf46-fae9d22c611f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.671493', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '537282f4-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': 'f139f5b8eabcd88afd63b6ac4cb58dd341b87cd385d0a1563a006e087bd5d25d'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:36:03.671493', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53729546-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': 'c64a47bb22406b3674f4928701e593fe023df1b3bcad896a3af0f6ff75fec73a'}]}, 'timestamp': '2025-11-26 09:36:03.672417', '_unique_id': '2df49a374ac0437bb5eeca192b360762'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.674 12 DEBUG 
ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ef38221-be73-4d1c-92a1-dfd8480dd3a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:36:03.674672', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5372fb76-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.803662667, 'message_signature': 
'eb6d9576eb2ad3a0daec5296ca3fa86bebba9b55a78fd76c649940ca15fb7fa5'}]}, 'timestamp': '2025-11-26 09:36:03.674984', '_unique_id': '61a1da122bec4784b3db0ce0f9d5c026'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.676 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.676 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.676 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ae5247d5-c40b-42fb-b17a-c0323c2ccf54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.676279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53733a0a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.846167129, 'message_signature': '01787d5afa2ce7c381793ddaaf469c1091602e729e2cf4de2d18c8e40feb0bda'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:36:03.676279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5373441e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.846167129, 'message_signature': 'c92d9a6bd6fbae2e75dcb233eb62ae002db92088af980f0293eeae4dc883fb65'}]}, 'timestamp': '2025-11-26 09:36:03.676803', '_unique_id': '413580f07ca84164b570007a88e6c2c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:36:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:36:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:36:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:36:03.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b068fa9a-2c50-46f3-8fd0-333c8ec8fbb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:36:03.678155', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53738366-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': '81a0e456428f5a656c6bf20886aa865d30f46ab538f6e063f4aea06678c0648c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:36:03.678155', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53738d70-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10238.815421328, 'message_signature': '6512517ef2c98de883a0d39adf4c9c129938b512af4b44e5bd9806e2788e4aa5'}]}, 'timestamp': '2025-11-26 09:36:03.678678', '_unique_id': 'a6210273c8e643f8ae085c3ce5902622'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:36:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:36:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:36:04 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:36:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47108 DF PROTO=TCP SPT=36664 DPT=9101 SEQ=2685834520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C238C00000000001030307)
Nov 26 04:36:04 localhost nova_compute[229802]: 2025-11-26 09:36:04.613 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:36:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 26 04:36:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 5692 writes, 25K keys, 5692 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5692 writes, 763 syncs, 7.46 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 26 04:36:05 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:36:05 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:36:06 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:36:06 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:06 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:06 localhost nova_compute[229802]: 2025-11-26 09:36:06.714 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:36:08 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:36:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38849 DF PROTO=TCP SPT=48138 DPT=9102 SEQ=3911535079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C24BFC0000000001030307)
Nov 26 04:36:09 localhost systemd[1]: var-lib-containers-storage-overlay-7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16-merged.mount: Deactivated successfully.
Nov 26 04:36:09 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:09 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:09 localhost nova_compute[229802]: 2025-11-26 09:36:09.643 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:36:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 26 04:36:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 4860 writes, 21K keys, 4860 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4860 writes, 621 syncs, 7.83 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 26 04:36:09 localhost sshd[244984]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:36:09 localhost systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Nov 26 04:36:09 localhost systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Nov 26 04:36:10 localhost python3.9[245038]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 26 04:36:10 localhost systemd[1]: Started libpod-conmon-f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.scope.
Nov 26 04:36:10 localhost podman[245039]: 2025-11-26 09:36:10.195207202 +0000 UTC m=+0.122982365 container exec f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 26 04:36:10 localhost podman[245039]: 2025-11-26 09:36:10.227367438 +0000 UTC m=+0.155142651 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 26 04:36:10 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:36:10 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 26 04:36:11 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:36:11 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:36:11 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:36:11 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:11 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:11 localhost nova_compute[229802]: 2025-11-26 09:36:11.718 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:36:12 localhost python3.9[245178]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 26 04:36:12 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:36:12 localhost podman[240049]: time="2025-11-26T09:36:12Z" level=error msg="Getting root fs size for \"b4943101d1e1374af66c7bc0dccfe1d6a9673519bb9ee51850c62dbe56f7b176\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Nov 26 04:36:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36182 DF PROTO=TCP SPT=51614 DPT=9882 SEQ=2906386088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C259FC0000000001030307)
Nov 26 04:36:12 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:36:12 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:36:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:13 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:13 localhost systemd[1]: libpod-conmon-f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.scope: Deactivated successfully.
Nov 26 04:36:13 localhost systemd[1]: Started libpod-conmon-f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.scope.
Nov 26 04:36:13 localhost podman[245179]: 2025-11-26 09:36:13.056343826 +0000 UTC m=+0.589394966 container exec f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 26 04:36:13 localhost podman[245179]: 2025-11-26 09:36:13.088083009 +0000 UTC m=+0.621134099 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 26 04:36:13 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:36:13 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:36:13 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:13 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:14 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Nov 26 04:36:14 localhost systemd[1]: var-lib-containers-storage-overlay-772983d29741817fb5112b04db0ec34846c51e947d40ce51144a956997c63192-merged.mount: Deactivated successfully.
Nov 26 04:36:14 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:36:14 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:36:14 localhost nova_compute[229802]: 2025-11-26 09:36:14.646 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:36:14 localhost python3.9[245318]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:36:15 localhost python3.9[245428]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 26 04:36:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65282 DF PROTO=TCP SPT=56570 DPT=9105 SEQ=4047016471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C266300000000001030307)
Nov 26 04:36:16 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:36:16 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:36:16 localhost nova_compute[229802]: 2025-11-26 09:36:16.753 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:36:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65283 DF PROTO=TCP SPT=56570 DPT=9105 SEQ=4047016471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C26A3C0000000001030307)
Nov 26 04:36:16 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:36:16 localhost systemd[1]: libpod-conmon-f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.scope: Deactivated successfully.
Nov 26 04:36:16 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:16 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:36:18 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:36:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65284 DF PROTO=TCP SPT=56570 DPT=9105 SEQ=4047016471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2723C0000000001030307)
Nov 26 04:36:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:36:18 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:36:19 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:36:19 localhost nova_compute[229802]: 2025-11-26 09:36:19.677 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:36:19 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Nov 26 04:36:19 localhost systemd[1]: var-lib-containers-storage-overlay-1ad32a5db29098f5568060ccdb89afe68c9fb2dd318793af5aa95785da54e96e-merged.mount: Deactivated successfully.
Nov 26 04:36:20 localhost systemd[1]: var-lib-containers-storage-overlay-1ad32a5db29098f5568060ccdb89afe68c9fb2dd318793af5aa95785da54e96e-merged.mount: Deactivated successfully.
Nov 26 04:36:20 localhost podman[245443]: 2025-11-26 09:36:20.161310665 +0000 UTC m=+1.267449782 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:36:20 localhost podman[245443]: 2025-11-26 09:36:20.178405005 +0000 UTC m=+1.284544152 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 26 04:36:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 04:36:21 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Nov 26 04:36:21 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Nov 26 04:36:21 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:21 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:36:21 localhost podman[245461]: 2025-11-26 09:36:21.521544195 +0000 UTC m=+0.776368571 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:36:21 localhost podman[245461]: 2025-11-26 09:36:21.559329839 +0000 UTC m=+0.814154215 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:36:21 localhost podman[245461]: unhealthy Nov 26 04:36:21 localhost nova_compute[229802]: 2025-11-26 09:36:21.757 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:21 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:36:22 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:22 localhost python3.9[245595]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:36:22 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:36:22 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Nov 26 04:36:22 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:36:22 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. Nov 26 04:36:22 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:22 localhost systemd[1]: Started libpod-conmon-4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.scope. Nov 26 04:36:22 localhost podman[245596]: 2025-11-26 09:36:22.620300898 +0000 UTC m=+0.363565733 container exec 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:36:22 localhost podman[245596]: 2025-11-26 09:36:22.649704517 +0000 UTC m=+0.392969382 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:36:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=65285 DF PROTO=TCP SPT=56570 DPT=9105 SEQ=4047016471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C281FC0000000001030307) Nov 26 04:36:22 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Nov 26 04:36:23 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:23 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:23 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:23 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:23 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:36:24 localhost python3.9[245734]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:36:24 localhost nova_compute[229802]: 2025-11-26 09:36:24.708 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58320 DF PROTO=TCP SPT=40192 DPT=9102 SEQ=3182919638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2897C0000000001030307) Nov 26 04:36:25 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:36:25 localhost systemd[1]: var-lib-containers-storage-overlay-a9100a267b24367cb8370dbd929de9892c803b386b41da1c102b2b3d1fba1b2a-merged.mount: Deactivated successfully. Nov 26 04:36:26 localhost systemd[1]: var-lib-containers-storage-overlay-a9100a267b24367cb8370dbd929de9892c803b386b41da1c102b2b3d1fba1b2a-merged.mount: Deactivated successfully. Nov 26 04:36:26 localhost systemd[1]: libpod-conmon-4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.scope: Deactivated successfully. Nov 26 04:36:26 localhost systemd[1]: Started libpod-conmon-4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.scope. 
Nov 26 04:36:26 localhost podman[245735]: 2025-11-26 09:36:26.287104297 +0000 UTC m=+2.249482877 container exec 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:36:26 localhost podman[245735]: 2025-11-26 09:36:26.316321931 +0000 UTC m=+2.278700531 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 
'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:36:26 localhost nova_compute[229802]: 2025-11-26 09:36:26.760 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:27 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:36:27 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:36:27 localhost systemd[1]: libpod-conmon-4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.scope: Deactivated successfully. 
Nov 26 04:36:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:36:27 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Nov 26 04:36:27 localhost systemd[1]: var-lib-containers-storage-overlay-435018edd8bd40c695e2155529ca14d60cdcfce0e73e35f8aa64d52f759a88b7-merged.mount: Deactivated successfully. Nov 26 04:36:27 localhost podman[245876]: 2025-11-26 09:36:27.970007239 +0000 UTC m=+0.118287666 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 26 04:36:28 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33998 DF PROTO=TCP SPT=35184 DPT=9100 SEQ=3580594546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C295FD0000000001030307) Nov 26 04:36:28 localhost python3.9[245875]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:36:28 localhost podman[245876]: 2025-11-26 09:36:28.080133527 +0000 UTC m=+0.228413914 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:36:28 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:28 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:28 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Nov 26 04:36:28 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:36:28 localhost python3.9[246009]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman Nov 26 04:36:28 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:29 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:29 localhost nova_compute[229802]: 2025-11-26 09:36:29.713 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:30 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:36:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:36:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:36:30 localhost podman[246023]: 2025-11-26 09:36:30.198377489 +0000 UTC m=+0.101672133 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.) 
Nov 26 04:36:30 localhost podman[246023]: 2025-11-26 09:36:30.234477638 +0000 UTC m=+0.137772232 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_id=edpm, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9) Nov 26 04:36:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58322 DF PROTO=TCP SPT=40192 DPT=9102 SEQ=3182919638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2A13D0000000001030307) Nov 26 04:36:31 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:36:31 localhost systemd[1]: var-lib-containers-storage-overlay-530f599619a39baef532cc44e4223e07e6aec95ffdcc3af67e9a79046d6bd825-merged.mount: Deactivated successfully. 
Nov 26 04:36:31 localhost nova_compute[229802]: 2025-11-26 09:36:31.794 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:32 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:36:32 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:36:33 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:36:33 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:36:33 localhost podman[246035]: 2025-11-26 09:36:33.154197514 +0000 UTC m=+2.980432983 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:36:33 localhost podman[246035]: 2025-11-26 09:36:33.18924328 +0000 UTC m=+3.015478749 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:36:33 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:36:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:36:33 localhost sshd[246192]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:36:33 localhost python3.9[246194]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:36:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31969 DF PROTO=TCP SPT=56432 DPT=9101 SEQ=2043149364 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2ADFC0000000001030307) Nov 26 04:36:34 localhost nova_compute[229802]: 2025-11-26 09:36:34.755 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:35 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:36:35 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:36:35 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. Nov 26 04:36:35 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:36:35 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:36:35 localhost systemd[1]: Started libpod-conmon-b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.scope. Nov 26 04:36:35 localhost podman[246195]: 2025-11-26 09:36:35.807855955 +0000 UTC m=+1.889590031 container exec b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:36:35 localhost podman[246118]: 2025-11-26 09:36:35.819128391 +0000 UTC m=+2.326921334 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 04:36:35 localhost podman[246118]: 2025-11-26 09:36:35.825277315 +0000 UTC m=+2.333070308 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 26 04:36:35 localhost podman[246195]: 2025-11-26 09:36:35.837092938 +0000 UTC m=+1.918826994 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:36:35 localhost podman[246119]: 2025-11-26 09:36:35.930086155 +0000 UTC m=+2.429706169 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 04:36:35 localhost podman[246119]: 2025-11-26 09:36:35.939076049 +0000 UTC m=+2.438696073 
container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:36:36 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully. 
Nov 26 04:36:36 localhost nova_compute[229802]: 2025-11-26 09:36:36.846 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:36 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:36 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:36:37 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:36:37 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:36:37 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. Nov 26 04:36:37 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. Nov 26 04:36:37 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:36:37 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:36:38 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:38 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:36:38 localhost python3.9[246350]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:36:38 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:36:38 localhost systemd[1]: libpod-conmon-b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.scope: Deactivated successfully. Nov 26 04:36:38 localhost systemd[1]: Started libpod-conmon-b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.scope. Nov 26 04:36:38 localhost podman[246351]: 2025-11-26 09:36:38.951214772 +0000 UTC m=+0.317850549 container exec b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:36:38 localhost podman[246351]: 2025-11-26 09:36:38.982675646 +0000 UTC m=+0.349311453 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:36:39 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:39 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:36:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47164 DF PROTO=TCP SPT=46958 DPT=9100 SEQ=3977621154 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2C1FC0000000001030307) Nov 26 04:36:39 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:39 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. 
Nov 26 04:36:39 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:36:39 localhost nova_compute[229802]: 2025-11-26 09:36:39.780 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:40 localhost python3.9[246491]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:36:41 localhost python3.9[246601]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman Nov 26 04:36:41 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:36:41 localhost nova_compute[229802]: 2025-11-26 09:36:41.881 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:42 localhost systemd[1]: var-lib-containers-storage-overlay-06407ef92aa62a84cce5a4c105c3570211ecf8dcded9a8a2a2909fc5cebcb893-merged.mount: Deactivated successfully. Nov 26 04:36:42 localhost systemd[1]: var-lib-containers-storage-overlay-06407ef92aa62a84cce5a4c105c3570211ecf8dcded9a8a2a2909fc5cebcb893-merged.mount: Deactivated successfully. Nov 26 04:36:42 localhost systemd[1]: libpod-conmon-b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.scope: Deactivated successfully. 
Nov 26 04:36:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31970 DF PROTO=TCP SPT=56432 DPT=9101 SEQ=2043149364 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2CDFD0000000001030307) Nov 26 04:36:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38708 DF PROTO=TCP SPT=45094 DPT=9882 SEQ=4226454377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2CFFC0000000001030307) Nov 26 04:36:44 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully. Nov 26 04:36:44 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:36:44 localhost nova_compute[229802]: 2025-11-26 09:36:44.782 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:44 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:36:45 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Nov 26 04:36:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42619 DF PROTO=TCP SPT=59138 DPT=9105 SEQ=921786513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2DB600000000001030307) Nov 26 04:36:46 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:36:46 localhost nova_compute[229802]: 2025-11-26 09:36:46.934 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:46 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:36:47 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:36:47 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:36:47 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully. Nov 26 04:36:48 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully. Nov 26 04:36:48 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:48 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:36:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42621 DF PROTO=TCP SPT=59138 DPT=9105 SEQ=921786513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2E77D0000000001030307) Nov 26 04:36:48 localhost python3.9[246724]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:36:49 localhost systemd[1]: Started libpod-conmon-a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.scope. Nov 26 04:36:49 localhost podman[246725]: 2025-11-26 09:36:49.047572483 +0000 UTC m=+0.132320180 container exec a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, vendor=Red Hat, Inc.) 
Nov 26 04:36:49 localhost podman[246725]: 2025-11-26 09:36:49.076544637 +0000 UTC m=+0.161292334 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 26 04:36:49 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:49 localhost podman[240049]: time="2025-11-26T09:36:49Z" level=error msg="Getting root fs size for \"b29f4cd20a1c18ffd470f87f5036c652bb1768cdf8614e6a7c6503ca9a73b365\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy" Nov 26 04:36:49 localhost nova_compute[229802]: 2025-11-26 09:36:49.826 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:49 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:36:50 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:36:50 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:36:50 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:50 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:51 localhost python3.9[246862]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Nov 26 04:36:51 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:36:51 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:36:51 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:36:51 localhost nova_compute[229802]: 2025-11-26 09:36:51.939 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 04:36:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42622 DF PROTO=TCP SPT=59138 DPT=9105 SEQ=921786513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2F73C0000000001030307) Nov 26 04:36:53 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:36:53 localhost systemd[1]: var-lib-containers-storage-overlay-7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16-merged.mount: Deactivated successfully. Nov 26 04:36:53 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:53 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:53 localhost systemd[1]: var-lib-containers-storage-overlay-7fbacd248b5281d15359a0a3185510949d60a9c5c12517cf35c6a3746148bd16-merged.mount: Deactivated successfully. Nov 26 04:36:53 localhost systemd[1]: libpod-conmon-a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.scope: Deactivated successfully. 
Nov 26 04:36:53 localhost podman[246874]: 2025-11-26 09:36:53.798584515 +0000 UTC m=+2.260254027 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:36:53 localhost podman[246874]: 2025-11-26 09:36:53.811350054 +0000 UTC m=+2.273019526 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 04:36:53 localhost systemd[1]: Started libpod-conmon-a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.scope. 
Nov 26 04:36:53 localhost podman[246863]: 2025-11-26 09:36:53.906133067 +0000 UTC m=+2.556153036 container exec a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350) Nov 26 04:36:53 localhost podman[246863]: 2025-11-26 09:36:53.935149183 +0000 UTC m=+2.585169192 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6, vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41) Nov 26 04:36:54 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:54 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:54 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:36:54 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:36:54 localhost podman[246886]: 2025-11-26 09:36:54.782773124 +0000 UTC m=+2.038589152 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:36:54 localhost podman[246886]: 2025-11-26 09:36:54.793552481 +0000 UTC m=+2.049368509 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck 
podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:36:54 localhost podman[246886]: unhealthy Nov 26 04:36:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44303 DF PROTO=TCP SPT=55300 DPT=9102 SEQ=435731928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2FEBC0000000001030307) Nov 26 04:36:54 localhost nova_compute[229802]: 2025-11-26 09:36:54.830 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:55 localhost systemd[1]: libpod-conmon-a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.scope: Deactivated successfully. Nov 26 04:36:55 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:36:55 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. 
Nov 26 04:36:55 localhost nova_compute[229802]: 2025-11-26 09:36:55.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:36:55 localhost nova_compute[229802]: 2025-11-26 09:36:55.676 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:36:55 localhost nova_compute[229802]: 2025-11-26 09:36:55.677 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:36:55 localhost nova_compute[229802]: 2025-11-26 09:36:55.677 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:36:55 localhost nova_compute[229802]: 2025-11-26 09:36:55.677 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:36:55 localhost nova_compute[229802]: 2025-11-26 09:36:55.678 229806 DEBUG oslo_concurrency.processutils [None 
req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:36:56 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:56 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.160 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:36:56 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. 
Nov 26 04:36:56 localhost python3.9[247098]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.278 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.279 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.464 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.467 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12294MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.467 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.468 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.560 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.560 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.560 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.574 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.589 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 
04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.590 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.603 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.626 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: 
COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_FMA3,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_F16C,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_DEVICE_TAGGING,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.669 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:36:56 localhost systemd[1]: 
var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:56 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:56 localhost nova_compute[229802]: 2025-11-26 09:36:56.976 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:57 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:57 localhost nova_compute[229802]: 2025-11-26 09:36:57.108 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:36:57 localhost nova_compute[229802]: 2025-11-26 09:36:57.117 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:36:57 localhost nova_compute[229802]: 2025-11-26 09:36:57.159 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 
'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:36:57 localhost nova_compute[229802]: 2025-11-26 09:36:57.162 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:36:57 localhost nova_compute[229802]: 2025-11-26 09:36:57.163 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:36:57 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully. Nov 26 04:36:57 localhost systemd[1]: var-lib-containers-storage-overlay-61b78344a7655d1356d5611c58b53e5d1f1449149a2ea477d54243044567e90e-merged.mount: Deactivated successfully. Nov 26 04:36:58 localhost systemd[1]: var-lib-containers-storage-overlay-61b78344a7655d1356d5611c58b53e5d1f1449149a2ea477d54243044567e90e-merged.mount: Deactivated successfully. Nov 26 04:36:58 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:58 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:36:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7908 DF PROTO=TCP SPT=49198 DPT=9101 SEQ=2133492065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C30B7C0000000001030307) Nov 26 04:36:58 localhost nova_compute[229802]: 2025-11-26 09:36:58.163 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:36:58 localhost nova_compute[229802]: 2025-11-26 09:36:58.164 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:36:58 localhost nova_compute[229802]: 2025-11-26 09:36:58.164 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:36:58 localhost nova_compute[229802]: 2025-11-26 09:36:58.545 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:36:58 localhost nova_compute[229802]: 2025-11-26 09:36:58.545 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:36:58 localhost nova_compute[229802]: 2025-11-26 09:36:58.546 229806 DEBUG 
nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:36:58 localhost nova_compute[229802]: 2025-11-26 09:36:58.546 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:36:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:36:58 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:58 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:36:58 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. 
Nov 26 04:36:58 localhost podman[247169]: 2025-11-26 09:36:58.827907966 +0000 UTC m=+0.087035146 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 04:36:58 localhost podman[247169]: 2025-11-26 09:36:58.867652337 +0000 UTC m=+0.126779527 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Nov 26 04:36:58 localhost nova_compute[229802]: 2025-11-26 09:36:58.967 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:36:59 localhost nova_compute[229802]: 2025-11-26 09:36:59.026 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:36:59 localhost nova_compute[229802]: 2025-11-26 09:36:59.027 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:36:59 localhost nova_compute[229802]: 2025-11-26 09:36:59.027 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:36:59 localhost nova_compute[229802]: 2025-11-26 09:36:59.028 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:36:59 localhost nova_compute[229802]: 2025-11-26 09:36:59.028 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:36:59 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:36:59 localhost nova_compute[229802]: 2025-11-26 09:36:59.468 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:36:59 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:36:59 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Nov 26 04:36:59 localhost systemd[1]: var-lib-containers-storage-overlay-772983d29741817fb5112b04db0ec34846c51e947d40ce51144a956997c63192-merged.mount: Deactivated successfully. Nov 26 04:36:59 localhost nova_compute[229802]: 2025-11-26 09:36:59.853 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:36:59 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:36:59 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Nov 26 04:37:00 localhost nova_compute[229802]: 2025-11-26 09:37:00.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:37:00 localhost nova_compute[229802]: 2025-11-26 09:37:00.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:37:00 localhost nova_compute[229802]: 2025-11-26 09:37:00.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:37:00 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Nov 26 04:37:00 localhost systemd[1]: var-lib-containers-storage-overlay-4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283-merged.mount: Deactivated successfully. 
Nov 26 04:37:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44305 DF PROTO=TCP SPT=55300 DPT=9102 SEQ=435731928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3167C0000000001030307) Nov 26 04:37:01 localhost nova_compute[229802]: 2025-11-26 09:37:01.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:37:01 localhost nova_compute[229802]: 2025-11-26 09:37:01.979 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:02 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:37:02 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:37:02 localhost nova_compute[229802]: 2025-11-26 09:37:02.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:37:02 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:37:03 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. 
Nov 26 04:37:03 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:37:03 localhost podman[247212]: 2025-11-26 09:37:03.355329116 +0000 UTC m=+0.076001433 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, distribution-scope=public) Nov 26 04:37:03 localhost podman[247212]: 2025-11-26 09:37:03.393399994 +0000 UTC m=+0.114072301 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7) Nov 26 04:37:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:37:03.637 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:37:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:37:03.638 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:37:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:37:03.640 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:37:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7910 DF PROTO=TCP SPT=49198 DPT=9101 SEQ=2133492065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3233D0000000001030307) Nov 26 04:37:04 localhost nova_compute[229802]: 2025-11-26 09:37:04.889 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:04 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Nov 26 04:37:05 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:37:05 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:37:05 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:37:05 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:05 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:05 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:37:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:37:06 localhost podman[247232]: 2025-11-26 09:37:06.048632785 +0000 UTC m=+0.099705682 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:37:06 localhost podman[247232]: 2025-11-26 09:37:06.085310537 +0000 UTC m=+0.136383424 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:37:06 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:37:06 localhost podman[240049]: time="2025-11-26T09:37:06Z" level=error msg="Getting root fs size for \"be5982fcd687e82860b080b799e695b90682b3fd1b100a2990a5b909b8891628\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy" Nov 26 04:37:06 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:37:06 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:06 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:06 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:37:07 localhost nova_compute[229802]: 2025-11-26 09:37:07.014 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:07 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:07 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:37:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:37:08 localhost podman[247255]: 2025-11-26 09:37:08.101214827 +0000 UTC m=+0.149162753 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 04:37:08 localhost podman[247255]: 2025-11-26 09:37:08.110439264 +0000 UTC m=+0.158387200 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 04:37:08 localhost 
systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:08 localhost podman[247256]: 2025-11-26 09:37:08.061560068 +0000 UTC m=+0.101091968 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 26 
04:37:08 localhost podman[247256]: 2025-11-26 09:37:08.192003939 +0000 UTC m=+0.231535789 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd) Nov 26 04:37:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=44306 DF PROTO=TCP SPT=55300 DPT=9102 SEQ=435731928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C335FC0000000001030307) Nov 26 04:37:09 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Nov 26 04:37:09 localhost systemd[1]: var-lib-containers-storage-overlay-a9100a267b24367cb8370dbd929de9892c803b386b41da1c102b2b3d1fba1b2a-merged.mount: Deactivated successfully. Nov 26 04:37:09 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:37:09 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:37:09 localhost nova_compute[229802]: 2025-11-26 09:37:09.935 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:10 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:10 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:37:10 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Nov 26 04:37:11 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:11 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:37:11 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:11 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:12 localhost nova_compute[229802]: 2025-11-26 09:37:12.043 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:12 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:37:12 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:12 localhost sshd[247293]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:37:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7911 DF PROTO=TCP SPT=49198 DPT=9101 SEQ=2133492065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C343FD0000000001030307) Nov 26 04:37:12 localhost systemd[1]: var-lib-containers-storage-overlay-23a3228a9cce0b0415c49ae8c807f0239a9343b6fe6519fde67c779fd8bde488-merged.mount: Deactivated successfully. Nov 26 04:37:12 localhost systemd[1]: var-lib-containers-storage-overlay-530f599619a39baef532cc44e4223e07e6aec95ffdcc3af67e9a79046d6bd825-merged.mount: Deactivated successfully. 
Nov 26 04:37:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46311 DF PROTO=TCP SPT=41006 DPT=9882 SEQ=2708091577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C345FC0000000001030307) Nov 26 04:37:13 localhost systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully. Nov 26 04:37:13 localhost systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully. Nov 26 04:37:14 localhost systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully. Nov 26 04:37:14 localhost nova_compute[229802]: 2025-11-26 09:37:14.936 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:15 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:37:15 localhost systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully. Nov 26 04:37:15 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. Nov 26 04:37:15 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully. Nov 26 04:37:15 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Nov 26 04:37:15 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63697 DF PROTO=TCP SPT=48456 DPT=9105 SEQ=570032398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C350900000000001030307) Nov 26 04:37:16 localhost systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully. Nov 26 04:37:16 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:37:16 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. Nov 26 04:37:17 localhost nova_compute[229802]: 2025-11-26 09:37:17.084 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:17 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:37:17 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. Nov 26 04:37:17 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:17 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. 
Nov 26 04:37:17 localhost systemd[1]: var-lib-containers-storage-overlay-bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238-merged.mount: Deactivated successfully. Nov 26 04:37:18 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:37:18 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:37:18 localhost systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully. Nov 26 04:37:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63699 DF PROTO=TCP SPT=48456 DPT=9105 SEQ=570032398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C35C7D0000000001030307) Nov 26 04:37:19 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully. Nov 26 04:37:19 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully. Nov 26 04:37:19 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. Nov 26 04:37:19 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully. 
Nov 26 04:37:19 localhost nova_compute[229802]: 2025-11-26 09:37:19.957 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:21 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:37:21 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:37:21 localhost systemd[1]: var-lib-containers-storage-overlay-5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254-merged.mount: Deactivated successfully. Nov 26 04:37:22 localhost nova_compute[229802]: 2025-11-26 09:37:22.088 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:22 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:37:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63700 DF PROTO=TCP SPT=48456 DPT=9105 SEQ=570032398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C36C3D0000000001030307) Nov 26 04:37:23 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:37:23 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Nov 26 04:37:24 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:37:24 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully. Nov 26 04:37:24 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:24 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:37:24 localhost podman[247295]: 2025-11-26 09:37:24.83057839 +0000 UTC m=+0.093061241 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 26 04:37:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6374 DF PROTO=TCP SPT=45004 DPT=9102 SEQ=1956901186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C373FC0000000001030307) Nov 26 04:37:24 localhost podman[247295]: 2025-11-26 09:37:24.870722352 +0000 UTC m=+0.133205193 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 04:37:24 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:37:24 localhost nova_compute[229802]: 2025-11-26 09:37:24.960 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:25 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:37:26 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:37:26 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. 
Nov 26 04:37:26 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:37:26 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:26 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:37:26 localhost podman[247313]: 2025-11-26 09:37:26.975754621 +0000 UTC m=+1.323474865 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:37:26 localhost podman[247313]: 2025-11-26 09:37:26.991186238 +0000 UTC m=+1.338906482 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:37:26 localhost podman[247313]: unhealthy Nov 26 04:37:27 localhost nova_compute[229802]: 2025-11-26 09:37:27.093 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:27 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58325 DF PROTO=TCP SPT=40192 DPT=9102 SEQ=3182919638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C37FFC0000000001030307) Nov 26 04:37:27 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:37:28 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:28 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:28 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:37:28 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. Nov 26 04:37:28 localhost podman[240049]: time="2025-11-26T09:37:28Z" level=error msg="Getting root fs size for \"f67337eb348d14cc8789e9dcf8617d0dec3d3d925b69fc4ab56922ca0f9658f9\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy" Nov 26 04:37:28 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:28 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:28 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:37:28 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:28 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:37:29 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:37:29 localhost podman[247335]: 2025-11-26 09:37:29.829165013 +0000 UTC m=+0.092594740 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Nov 26 04:37:29 localhost podman[247335]: 2025-11-26 09:37:29.870080004 +0000 UTC m=+0.133509761 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller) Nov 26 04:37:29 localhost nova_compute[229802]: 2025-11-26 09:37:29.993 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6376 DF PROTO=TCP SPT=45004 DPT=9102 SEQ=1956901186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C38BBC0000000001030307) Nov 26 04:37:31 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Nov 26 04:37:31 localhost systemd[1]: var-lib-containers-storage-overlay-562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3-merged.mount: Deactivated successfully. Nov 26 04:37:31 localhost systemd[1]: var-lib-containers-storage-overlay-562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3-merged.mount: Deactivated successfully. Nov 26 04:37:31 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully. Nov 26 04:37:31 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:37:32 localhost nova_compute[229802]: 2025-11-26 09:37:32.115 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:32 localhost systemd[1]: var-lib-containers-storage-overlay-61b78344a7655d1356d5611c58b53e5d1f1449149a2ea477d54243044567e90e-merged.mount: Deactivated successfully. Nov 26 04:37:32 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:32 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Nov 26 04:37:32 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Nov 26 04:37:33 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:33 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:37:33 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29570 DF PROTO=TCP SPT=33746 DPT=9101 SEQ=3456300304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3983C0000000001030307) Nov 26 04:37:34 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:34 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:37:34 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:37:34 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:37:34 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Nov 26 04:37:34 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Nov 26 04:37:35 localhost nova_compute[229802]: 2025-11-26 09:37:35.030 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:35 localhost systemd[1]: var-lib-containers-storage-overlay-4bba6b9e4ee096356fde8a3c1b121b7bd17e19e8adc642d7cdd467db161ba283-merged.mount: Deactivated successfully. 
Nov 26 04:37:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:37:35 localhost systemd[1]: tmp-crun.ddPo4a.mount: Deactivated successfully. Nov 26 04:37:35 localhost podman[247358]: 2025-11-26 09:37:35.8372029 +0000 UTC m=+0.100435725 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 26 04:37:35 localhost podman[247358]: 2025-11-26 09:37:35.854772001 +0000 UTC m=+0.118004836 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal)
Nov 26 04:37:36 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:37:37 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 26 04:37:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.
Nov 26 04:37:37 localhost nova_compute[229802]: 2025-11-26 09:37:37.154 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:37:37 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 26 04:37:37 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 26 04:37:37 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:37 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully.
Nov 26 04:37:37 localhost podman[247378]: 2025-11-26 09:37:37.328721506 +0000 UTC m=+0.281848485 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors )
Nov 26 04:37:37 localhost podman[247378]: 2025-11-26 09:37:37.364255418 +0000 UTC m=+0.317382457 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Nov 26 04:37:38 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:37:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:37:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6377 DF PROTO=TCP SPT=45004 DPT=9102 SEQ=1956901186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3ABFC0000000001030307)
Nov 26 04:37:39 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 26 04:37:39 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Nov 26 04:37:39 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully.
Nov 26 04:37:39 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:37:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:37:40 localhost nova_compute[229802]: 2025-11-26 09:37:40.081 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:37:40 localhost podman[247401]: 2025-11-26 09:37:40.117858407 +0000 UTC m=+0.121400302 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 26 04:37:40 localhost podman[247401]: 2025-11-26 09:37:40.153154342 +0000 UTC m=+0.156696217 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 04:37:40 localhost podman[247400]: 2025-11-26 09:37:40.232018411 +0000 UTC m=+0.239120407 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:37:40 localhost podman[247400]: 2025-11-26 09:37:40.267298376 +0000 UTC m=+0.274400312 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:37:40 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:37:40 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:37:40 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:37:41 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully.
Nov 26 04:37:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:41 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:37:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:41 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:41 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:37:41 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:37:42 localhost nova_compute[229802]: 2025-11-26 09:37:42.191 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:37:42 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:37:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29571 DF PROTO=TCP SPT=33746 DPT=9101 SEQ=3456300304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3B7FC0000000001030307)
Nov 26 04:37:42 localhost python3.9[247530]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:43 localhost systemd[1]: var-lib-containers-storage-overlay-f82f216961b52eae10893792e45fbdb3b64d10f3200729b24023287a013ca587-merged.mount: Deactivated successfully.
Nov 26 04:37:43 localhost python3.9[247640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:37:43 localhost systemd[1]: var-lib-containers-storage-overlay-f82f216961b52eae10893792e45fbdb3b64d10f3200729b24023287a013ca587-merged.mount: Deactivated successfully.
Nov 26 04:37:44 localhost python3.9[247728]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149863.156921-3066-66240729994387/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:44 localhost systemd[1]: var-lib-containers-storage-overlay-23a3228a9cce0b0415c49ae8c807f0239a9343b6fe6519fde67c779fd8bde488-merged.mount: Deactivated successfully.
Nov 26 04:37:44 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Nov 26 04:37:45 localhost nova_compute[229802]: 2025-11-26 09:37:45.131 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:37:45 localhost python3.9[247838]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3010 DF PROTO=TCP SPT=35936 DPT=9105 SEQ=1275732352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3C5C00000000001030307)
Nov 26 04:37:45 localhost python3.9[247948]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:37:46 localhost systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully.
Nov 26 04:37:46 localhost systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully.
Nov 26 04:37:46 localhost systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully.
Nov 26 04:37:46 localhost python3.9[248005]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3011 DF PROTO=TCP SPT=35936 DPT=9105 SEQ=1275732352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3C9BC0000000001030307)
Nov 26 04:37:47 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:37:47 localhost python3.9[248115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:37:47 localhost nova_compute[229802]: 2025-11-26 09:37:47.195 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:37:47 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 26 04:37:47 localhost systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully.
Nov 26 04:37:47 localhost python3.9[248172]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mutad2rr recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:47 localhost systemd[1]: var-lib-containers-storage-overlay-0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96-merged.mount: Deactivated successfully.
Nov 26 04:37:48 localhost python3.9[248283]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:37:48 localhost systemd[1]: var-lib-containers-storage-overlay-6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068-merged.mount: Deactivated successfully.
Nov 26 04:37:48 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 26 04:37:48 localhost systemd[1]: var-lib-containers-storage-overlay-0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c-merged.mount: Deactivated successfully.
Nov 26 04:37:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3012 DF PROTO=TCP SPT=35936 DPT=9105 SEQ=1275732352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3D1BC0000000001030307)
Nov 26 04:37:48 localhost python3.9[248340]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:49 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:37:49 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:37:49 localhost systemd[1]: var-lib-containers-storage-overlay-ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6-merged.mount: Deactivated successfully.
Nov 26 04:37:49 localhost python3.9[248450]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:37:50 localhost nova_compute[229802]: 2025-11-26 09:37:50.133 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:37:50 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:37:50 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:37:50 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:37:50 localhost python3[248561]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 26 04:37:50 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:50 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:51 localhost systemd[1]: var-lib-containers-storage-overlay-3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3-merged.mount: Deactivated successfully.
Nov 26 04:37:51 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Nov 26 04:37:51 localhost sshd[248672]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:37:51 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:37:51 localhost python3.9[248671]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:37:51 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Nov 26 04:37:51 localhost python3.9[248730]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:51 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:52 localhost nova_compute[229802]: 2025-11-26 09:37:52.198 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:37:52 localhost python3.9[248840]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:37:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3013 DF PROTO=TCP SPT=35936 DPT=9105 SEQ=1275732352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3E17D0000000001030307)
Nov 26 04:37:53 localhost python3.9[248897]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:53 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:37:53 localhost python3.9[249007]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:37:54 localhost python3.9[249064]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:54 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Nov 26 04:37:54 localhost systemd[1]: var-lib-containers-storage-overlay-97061593b7b68895c2bf349b5cf4d3820f346f775d77096764091fa4e8f8d995-merged.mount: Deactivated successfully.
Nov 26 04:37:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Nov 26 04:37:54 localhost systemd[1]: var-lib-containers-storage-overlay-97061593b7b68895c2bf349b5cf4d3820f346f775d77096764091fa4e8f8d995-merged.mount: Deactivated successfully.
Nov 26 04:37:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42005 DF PROTO=TCP SPT=48152 DPT=9102 SEQ=3066544008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3E8FD0000000001030307)
Nov 26 04:37:55 localhost nova_compute[229802]: 2025-11-26 09:37:55.136 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:37:55 localhost python3.9[249174]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 26 04:37:55 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully.
Nov 26 04:37:55 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully.
Nov 26 04:37:55 localhost python3.9[249231]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:37:55 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Nov 26 04:37:55 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Nov 26 04:37:55 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:37:56 localhost nova_compute[229802]: 2025-11-26 09:37:56.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:37:56 localhost nova_compute[229802]: 2025-11-26 09:37:56.609 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:37:56 localhost nova_compute[229802]: 2025-11-26 09:37:56.610 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:37:56 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Nov 26 04:37:56 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Nov 26 04:37:57 localhost python3.9[249341]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:37:57 localhost nova_compute[229802]: 2025-11-26 09:37:57.202 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:37:57 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:37:57 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:37:57 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. 
Nov 26 04:37:57 localhost podman[249374]: 2025-11-26 09:37:57.413773671 +0000 UTC m=+0.100359864 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 04:37:57 localhost podman[249374]: 2025-11-26 09:37:57.424627959 +0000 UTC m=+0.111214142 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:37:57 localhost podman[240049]: time="2025-11-26T09:37:57Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\"" Nov 26 04:37:57 localhost podman[240049]: @ - - [26/Nov/2025:09:32:25 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1" Nov 26 04:37:57 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:37:57 localhost nova_compute[229802]: 2025-11-26 09:37:57.561 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:37:57 localhost nova_compute[229802]: 2025-11-26 09:37:57.562 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:37:57 localhost nova_compute[229802]: 2025-11-26 09:37:57.563 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:37:57 localhost nova_compute[229802]: 2025-11-26 09:37:57.563 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:37:57 localhost python3.9[249451]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764149876.5569084-3441-32026568907195/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:37:57 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:37:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4337 DF PROTO=TCP SPT=33376 DPT=9101 SEQ=3181714146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3F5BC0000000001030307) Nov 26 04:37:58 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:37:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:37:58 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:37:58 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:37:58 localhost podman[249523]: 2025-11-26 09:37:58.2668594 +0000 UTC m=+0.093871783 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:37:58 localhost podman[249523]: 2025-11-26 09:37:58.30542568 +0000 UTC m=+0.132438073 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:37:58 localhost podman[249523]: unhealthy Nov 26 04:37:58 localhost nova_compute[229802]: 2025-11-26 09:37:58.519 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:37:58 localhost nova_compute[229802]: 2025-11-26 09:37:58.547 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:37:58 localhost 
nova_compute[229802]: 2025-11-26 09:37:58.547 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:37:58 localhost nova_compute[229802]: 2025-11-26 09:37:58.548 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:37:58 localhost nova_compute[229802]: 2025-11-26 09:37:58.570 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:37:58 localhost nova_compute[229802]: 2025-11-26 09:37:58.571 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:37:58 localhost nova_compute[229802]: 2025-11-26 09:37:58.571 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:37:58 localhost nova_compute[229802]: 2025-11-26 09:37:58.571 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute 
resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:37:58 localhost nova_compute[229802]: 2025-11-26 09:37:58.572 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:37:58 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Main process exited, code=exited, status=1/FAILURE Nov 26 04:37:58 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Failed with result 'exit-code'. Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.065 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.137 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.138 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:37:59 localhost python3.9[249623]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root 
path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.334 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.335 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12315MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.336 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.336 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.404 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in 
placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.405 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.405 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.449 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.927 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.936 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory 
/usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.955 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.958 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:37:59 localhost nova_compute[229802]: 2025-11-26 09:37:59.959 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:38:00 localhost nova_compute[229802]: 2025-11-26 09:38:00.020 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:00 localhost nova_compute[229802]: 2025-11-26 09:38:00.021 229806 DEBUG oslo_service.periodic_task [None 
req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:00 localhost nova_compute[229802]: 2025-11-26 09:38:00.021 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:38:00 localhost python3.9[249785]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:38:00 localhost nova_compute[229802]: 2025-11-26 09:38:00.139 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:00 localhost nova_compute[229802]: 2025-11-26 09:38:00.610 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42007 DF PROTO=TCP SPT=48152 DPT=9102 SEQ=3066544008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C400BC0000000001030307) Nov 26 04:38:01 localhost python3.9[249900]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include 
"/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:01 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:38:01 localhost systemd[1]: var-lib-containers-storage-overlay-562cf4f00dff93969d256b2be1bb2ab69067066ea2da814f14524981979b95c3-merged.mount: Deactivated successfully. Nov 26 04:38:01 localhost python3.9[250012]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:38:01 localhost auditd[727]: Audit daemon rotating log files Nov 26 04:38:02 localhost nova_compute[229802]: 2025-11-26 09:38:02.206 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:38:02 localhost podman[250086]: 2025-11-26 09:38:02.337288384 +0000 UTC m=+0.087850795 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 04:38:02 localhost podman[250086]: 2025-11-26 09:38:02.364262664 +0000 UTC m=+0.114825075 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller) Nov 26 04:38:02 localhost python3.9[250182]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:38:02 localhost nova_compute[229802]: 2025-11-26 09:38:02.605 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:02 localhost nova_compute[229802]: 2025-11-26 09:38:02.607 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:02 localhost nova_compute[229802]: 2025-11-26 09:38:02.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 
- - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:02 localhost nova_compute[229802]: 2025-11-26 09:38:02.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:02 localhost sshd[250203]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:38:03 localhost python3.9[250297]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.560 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.562 12 DEBUG ceilometer.polling.manager [-] Skip pollster 
disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.562 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.568 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e45d102-a5c9-4392-933d-5d16279ba0bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.562809', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9ae958b0-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': '95d1ae470562e8a0f5056ca18aa56d114d7eec798691bf6526ab18d60fd3cbf1'}]}, 'timestamp': '2025-11-26 09:38:03.569142', '_unique_id': 'bfcc7df27feb4a9da6c0a5edfe1ce05f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.571 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.572 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.605 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.606 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9538fc20-d682-4061-8c83-985ec76d5b88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.572683', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9aef02e2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': 'e90761b0a5d4752b366f70fb1b56066e6cedc4d40181a7eed9522d3d6f845cdc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:38:03.572683', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9aef167e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': 'd5a1b882b65a44d5cecc44bb6fd3c8c8f870272aa25d2e77745b39b14b41add4'}]}, 'timestamp': '2025-11-26 09:38:03.606603', '_unique_id': 'df885014dd0b4900affa98baca07520e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.607 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.609 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.609 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8f5443d-5ff8-4ecc-88c7-f4261062121f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.609160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9aef8c9e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': 'edadf7b25a447dffeec39b85e973ec2f3b2d8f67fd00d4ee89e2a97ecf92bccc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': 
'2025-11-26T09:38:03.609160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9aef9d2e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': '7d82d6f4ba59286e6d605e5de53b412317827478689c9a63a0d1e4dc17a65de8'}]}, 'timestamp': '2025-11-26 09:38:03.610073', '_unique_id': 'f98d5205686c42b5ae9e3ee1610216eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.611 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.612 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.627 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 52.296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '176846c7-384a-4cf8-af6e-111722b33e35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:38:03.612353', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9af263ce-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.869852972, 'message_signature': '965ef03e600e902be64511ff0c6adc8c9ce156ab340dae7e5a720a60861ba8c7'}]}, 'timestamp': '2025-11-26 09:38:03.628288', '_unique_id': 'fd17c802ac58476ea3cc644bea2cf0bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.629 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.630 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 04:38:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.630 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8ba5a9e-6fce-4106-8f5e-fcd014547453', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.630609', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9af2d296-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 
'message_signature': '72635bf27fd3d62e35af6241bf1d54c4ac300d1f76c36d648ecc875079b3cac6'}]}, 'timestamp': '2025-11-26 09:38:03.631140', '_unique_id': 'b621662a249541cdad57e303bee56e06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.632 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.633 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:38:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:38:03.639 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:38:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:38:03.640 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:38:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:38:03.643 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.647 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:38:03.648 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e93b847c-74e0-46e1-8218-d265806c6c59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.633460', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9af57d7a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.875713354, 'message_signature': '66f030cdc85d5a534aa7cc138f6e7d60f5e334681bc461cfce22bfb6c983d481'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:38:03.633460', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9af59116-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.875713354, 'message_signature': '2d27a9dd495774af12d54130138fe465984f79c71704c20fcb6e2d40383fec39'}]}, 'timestamp': '2025-11-26 09:38:03.649103', '_unique_id': '5ca30cbb9db545af85cf2f2703af779f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:38:03.650 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.650 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.651 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.652 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 
26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a734476-abb5-4bed-bb9a-d6391a72fa0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.652127', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9af61b90-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': '9bbbce6daedb8371f985c78f74dbc9bc6279a8d98b65451811db5c9d3602ac55'}]}, 'timestamp': '2025-11-26 09:38:03.652634', '_unique_id': '1be8de411b8c443fa710a0391c65860f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 
04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.653 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.654 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.654 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '390f2f43-0d41-4ee7-b14e-c92809131e5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.654846', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9af68616-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': '934742a7d5e64a40cf5d0eebcb0f15e3623a031812aeb6148a8a30f9f24a6adb'}]}, 'timestamp': '2025-11-26 09:38:03.655357', '_unique_id': 'f113c5d76f514f5f88c998f858a43981'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.657 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 627516836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.658 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 21052656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': '50a13dac-9287-4a79-837f-2532817e8331', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 627516836, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.657789', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9af6f8ee-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': 'fb62fa63a51b791e07a1b8350b71974b67ffe382e62b5b2d369bf706fa168338'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21052656, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:38:03.657789', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9af709ba-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': '60667654a60ccab806e1d6526c16fc97383ba500a291f2193c1840cd72de57c6'}]}, 'timestamp': '2025-11-26 09:38:03.658696', '_unique_id': '1d1b68adcdbc4a4fafc04be300c51d2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.661 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1141678425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.661 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 173265014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20a72ca9-4197-4142-b551-15b59029efe9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1141678425, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.660999', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9af77544-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': 'fef3d71c3157e4ce01fd84b712e3d70f1f62c230dbe59e89c86d00f86360be0e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 173265014, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:38:03.660999', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9af785ca-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': '2378cd6d87bb4d56066a87aeadd7e50aa17351a60e78b380b4432b95deb8fdc5'}]}, 'timestamp': '2025-11-26 09:38:03.661871', '_unique_id': 'cab2489d115a41afb34460abe70bcc59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.662 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.663 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.664 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8345c615-af37-447f-a515-1bc1327cf90b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.664144', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9af7f06e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': '67b2da9ac3dc9f25f1ba4e86873f83bf7488b324b83d1cf41a051733570edda0'}]}, 'timestamp': '2025-11-26 09:38:03.664634', '_unique_id': 'd808e0ef45dc45f686ec64c69e593c0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.665 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.666 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.666 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 58710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '151bd57f-29e6-40fb-80b5-36ded4ba20a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 58710000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:38:03.666832', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9af85a40-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.869852972, 'message_signature': '609a367d4b5e276255fe7df0cf359ee4fe579c1825b6c12464f40fcc016fe6b8'}]}, 'timestamp': '2025-11-26 09:38:03.667327', '_unique_id': 'd7f108fac3d743aaa92307925e57eef5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.668 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.669 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.669 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60d0ba81-1241-4551-b322-7e1433798e82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.669544', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9af8c304-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': '7d73a074df4a5f9349101ca74a85d8731173cc3d61606cecb10d2347a164f72a'}]}, 'timestamp': '2025-11-26 09:38:03.670054', '_unique_id': '8e5e2e844a874a3296bd7816a189be5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9b4639a-e11b-4a38-8a20-c8ee0d6f1f72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.672244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9af92c5e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.875713354, 'message_signature': '272b9cefd71ea0f01775f819450fffcd473cc762cf19f7c4ef197ceb9a177747'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:38:03.672244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9af9409a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.875713354, 'message_signature': 'cbbc721cf1ab14f2ba8c8922e97e14d3b72deaa010b3766f172c0fe68ef797ab'}]}, 'timestamp': '2025-11-26 09:38:03.673236', '_unique_id': '7386990b87f54252a772881a8b6de5be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.676 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0683a8d9-23bc-4d59-847d-bb2f35c2dccc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.675723', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9af9b52a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': '05b05b0f8f503222830cd3f9ddd96383997dc159dbe467714b01abc7514898ae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:38:03.675723', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9af9c9c0-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': '718f96228cac7613a6b300e10b2553ea0354559fecfa96d207cc14cd8a522c95'}]}, 'timestamp': '2025-11-26 09:38:03.676757', '_unique_id': '974dd4d29f274badaf426a8a83439896'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.679 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.679 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02d96cdc-0fce-4948-ab9b-617f620ddaa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.679201', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9afa3c84-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': '69b2cb8e41577b4322ed2fce2dd9b008ba3efe0f4ed8fbdd7095d67708002752'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:38:03.679201', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9afa4ce2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.814959693, 'message_signature': '26a22b09926e96bb385ea5c888f48cf1a48b6b08f30c2f908fcb3089eddea3c5'}]}, 'timestamp': '2025-11-26 09:38:03.680145', '_unique_id': '5099972485364643a0dc198cc4b44885'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.680 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.681 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.681 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a12298d5-0eb5-49b9-89c5-6eb4da84f775', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.681848', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9afaa14c-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': 'a24643eb9bc02d8900361f48cc1cc252e024bdae57bc6c21b3e752e6f2e317c2'}]}, 'timestamp': '2025-11-26 09:38:03.682313', '_unique_id': '9c555f46c0ab4fb485757b3a2330bf2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.682 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.683 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.683 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6657a27d-c453-4f97-b18b-d8533b797b1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.683828', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9afaee0e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': 'f47062fc1fcb1b88f930faab918d71a4132d2442c33195401528eb6c78a5106c'}]}, 'timestamp': '2025-11-26 09:38:03.684164', '_unique_id': '1997e51bfc2842cca4bf5347e583b35f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.684 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.685 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e90168c-ac14-4935-97ef-f3e27304ef9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.685742', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9afb3814-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': 'eb21e355bc4b583a61c9e6307ab902c8378d99101e3d48b8c7e8033cd629cc9a'}]}, 'timestamp': '2025-11-26 09:38:03.686058', '_unique_id': '65f7c58022df42fe955e5b93167e701f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:38:03.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.687 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 9035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a92bd1df-0e99-49b5-a883-c8eb245dd992', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9035, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:38:03.687394', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '9afb7888-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.805706595, 'message_signature': '9632f79a2ab80cbff23dc46669b1c006966549e9e8a66cf52ff62fc80389fc8d'}]}, 'timestamp': '2025-11-26 09:38:03.687687', '_unique_id': '884246350fe2434f920ff1fd3d672da6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.689 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.689 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dfc8a54-f46c-45b1-8f83-a9b13927dcbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:38:03.689082', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9afbba78-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.875713354, 'message_signature': '8c9bf4d5fceea80263ae31f99edac21156768ab9243e46171ac0f70cd4bad18f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:38:03.689082', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9afbc4f0-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10358.875713354, 'message_signature': 'cc9f2690423b823965c22edbb7b2d3b0a2ca1ac53214a7d65bf1e5de86bccb6c'}]}, 'timestamp': '2025-11-26 09:38:03.689626', '_unique_id': '82fb9455176f47b8948ae5bad8b3874c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:38:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:38:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:38:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:38:03 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:38:04 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Nov 26 04:38:04 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:38:04 localhost python3.9[250410]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:04 localhost systemd[1]: session-57.scope: Deactivated successfully. Nov 26 04:38:04 localhost systemd[1]: session-57.scope: Consumed 31.179s CPU time. Nov 26 04:38:04 localhost systemd-logind[761]: Session 57 logged out. Waiting for processes to exit. Nov 26 04:38:04 localhost systemd-logind[761]: Removed session 57. Nov 26 04:38:05 localhost nova_compute[229802]: 2025-11-26 09:38:05.142 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:06 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:38:06 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Nov 26 04:38:07 localhost nova_compute[229802]: 2025-11-26 09:38:07.210 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:07 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:38:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:38:07 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:38:07 localhost podman[250428]: 2025-11-26 09:38:07.634118504 +0000 UTC m=+0.084011345 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:38:07 localhost podman[250428]: 2025-11-26 09:38:07.64172567 +0000 UTC m=+0.091618501 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41) Nov 26 04:38:07 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:38:08 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:38:08 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:38:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42008 DF PROTO=TCP SPT=48152 DPT=9102 SEQ=3066544008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C421FD0000000001030307) Nov 26 04:38:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:38:09 localhost podman[250450]: 2025-11-26 09:38:09.812136115 +0000 UTC m=+0.074477708 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm) Nov 26 04:38:09 localhost podman[250450]: 2025-11-26 09:38:09.853516583 +0000 UTC m=+0.115858146 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:38:10 localhost nova_compute[229802]: 2025-11-26 09:38:10.144 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:10 localhost sshd[250473]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:38:11 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. 
Nov 26 04:38:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:38:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:38:11 localhost systemd-logind[761]: New session 58 of user zuul. Nov 26 04:38:11 localhost systemd[1]: Started Session 58 of User zuul. Nov 26 04:38:11 localhost systemd[1]: var-lib-containers-storage-overlay-f82f216961b52eae10893792e45fbdb3b64d10f3200729b24023287a013ca587-merged.mount: Deactivated successfully. Nov 26 04:38:11 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:38:11 localhost podman[250477]: 2025-11-26 09:38:11.362767152 +0000 UTC m=+0.304252330 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd) Nov 26 04:38:11 localhost podman[250477]: 2025-11-26 09:38:11.402085735 +0000 UTC m=+0.343570923 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 04:38:11 localhost podman[250476]: 2025-11-26 09:38:11.423324556 +0000 UTC m=+0.368489608 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Nov 26 04:38:11 localhost podman[250476]: 2025-11-26 09:38:11.452184825 +0000 UTC m=+0.397349847 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true) Nov 26 04:38:11 localhost python3.9[250620]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:12 localhost nova_compute[229802]: 2025-11-26 09:38:12.214 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:13 localhost python3.9[250730]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:13 localhost python3.9[250840]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:13 localhost systemd[1]: 
var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:38:13 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:38:14 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Nov 26 04:38:14 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:38:14 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:38:14 localhost python3.9[250948]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:15 localhost nova_compute[229802]: 2025-11-26 09:38:15.168 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:15 localhost python3.9[251034]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149893.9511719-105-214820178713224/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:15 localhost openstack_network_exporter[242153]: ERROR 09:38:15 appctl.go:131: Failed to prepare call to 
ovsdb-server: no control socket files found for the ovs db server Nov 26 04:38:15 localhost openstack_network_exporter[242153]: ERROR 09:38:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:38:15 localhost openstack_network_exporter[242153]: ERROR 09:38:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:38:15 localhost openstack_network_exporter[242153]: ERROR 09:38:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:38:15 localhost openstack_network_exporter[242153]: Nov 26 04:38:15 localhost openstack_network_exporter[242153]: ERROR 09:38:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:38:15 localhost openstack_network_exporter[242153]: Nov 26 04:38:15 localhost python3.9[251146]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:16 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:38:16 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Nov 26 04:38:16 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Nov 26 04:38:16 localhost python3.9[251233]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149895.5293832-150-238603142289582/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:17 localhost python3.9[251341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:17 localhost nova_compute[229802]: 2025-11-26 09:38:17.217 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:17 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:38:17 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Nov 26 04:38:17 localhost python3.9[251427]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149896.7217147-150-53475869527586/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:17 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Nov 26 04:38:18 localhost python3.9[251535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:18 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Nov 26 04:38:18 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Nov 26 04:38:18 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Nov 26 04:38:18 localhost python3.9[251621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149897.9039207-150-22775962939754/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=bf65e0eaa01fbacd21ac641b9ce1d1f0af9bcf56 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:20 localhost python3.9[251729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:20 localhost nova_compute[229802]: 2025-11-26 09:38:20.218 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:20 localhost python3.9[251815]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149899.7449052-324-135822073243438/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=fce2fc14e9ea703c92eb8589d29e752536516b0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:21 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Nov 26 04:38:21 localhost systemd[1]: var-lib-containers-storage-overlay-97061593b7b68895c2bf349b5cf4d3820f346f775d77096764091fa4e8f8d995-merged.mount: Deactivated successfully. Nov 26 04:38:21 localhost python3.9[251923]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:38:21 localhost systemd[1]: var-lib-containers-storage-overlay-97061593b7b68895c2bf349b5cf4d3820f346f775d77096764091fa4e8f8d995-merged.mount: Deactivated successfully. Nov 26 04:38:22 localhost python3.9[252035]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:22 localhost nova_compute[229802]: 2025-11-26 09:38:22.222 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:22 localhost systemd[1]: var-lib-containers-storage-overlay-55d5530fe8468c8c9907e0aa1de030811941604fa5f46de3db6dc15ec40906dd-merged.mount: Deactivated successfully. Nov 26 04:38:22 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. Nov 26 04:38:22 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. 
Nov 26 04:38:22 localhost python3.9[252145]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:23 localhost python3.9[252202]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43933 DF PROTO=TCP SPT=34756 DPT=9102 SEQ=1648210641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C45A390000000001030307) Nov 26 04:38:24 localhost python3.9[252312]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:24 localhost systemd[1]: var-lib-containers-storage-overlay-ae0ebe7656e29542866ff018f5be9a3d02c88268a65814cf045e1b6c30ffd352-merged.mount: Deactivated successfully. 
Nov 26 04:38:24 localhost podman[240049]: @ - - [26/Nov/2025:09:32:32 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 142737 "" "Go-http-client/1.1" Nov 26 04:38:24 localhost podman_exporter[240255]: ts=2025-11-26T09:38:24.396Z caller=exporter.go:96 level=info msg="Listening on" address=:9882 Nov 26 04:38:24 localhost podman_exporter[240255]: ts=2025-11-26T09:38:24.396Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882 Nov 26 04:38:24 localhost podman_exporter[240255]: ts=2025-11-26T09:38:24.396Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882 Nov 26 04:38:24 localhost python3.9[252371]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43934 DF PROTO=TCP SPT=34756 DPT=9102 SEQ=1648210641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C45E3C0000000001030307) Nov 26 04:38:25 localhost nova_compute[229802]: 2025-11-26 09:38:25.252 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:25 localhost python3.9[252481]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42009 DF PROTO=TCP SPT=48152 DPT=9102 SEQ=3066544008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C461FC0000000001030307) Nov 26 04:38:26 localhost python3.9[252591]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:26 localhost python3.9[252648]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43935 DF PROTO=TCP SPT=34756 DPT=9102 SEQ=1648210641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4663D0000000001030307) Nov 26 04:38:27 localhost nova_compute[229802]: 2025-11-26 09:38:27.251 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:27 
localhost python3.9[252758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:27 localhost podman[240049]: time="2025-11-26T09:38:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:38:27 localhost podman[240049]: @ - - [26/Nov/2025:09:38:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144354 "" "Go-http-client/1.1" Nov 26 04:38:27 localhost podman[240049]: @ - - [26/Nov/2025:09:38:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16339 "" "Go-http-client/1.1" Nov 26 04:38:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:38:27 localhost systemd[1]: tmp-crun.PVdZkb.mount: Deactivated successfully. 
Nov 26 04:38:27 localhost podman[252817]: 2025-11-26 09:38:27.692106321 +0000 UTC m=+0.104230325 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3) Nov 26 04:38:27 localhost podman[252817]: 2025-11-26 09:38:27.703178135 +0000 UTC m=+0.115302159 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 04:38:27 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:38:27 localhost python3.9[252816]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6379 DF PROTO=TCP SPT=45004 DPT=9102 SEQ=1956901186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C469FC0000000001030307) Nov 26 04:38:28 localhost python3.9[252945]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:38:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:38:28 localhost systemd[1]: Reloading. 
Nov 26 04:38:29 localhost podman[252947]: 2025-11-26 09:38:29.080718314 +0000 UTC m=+0.103392087 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:38:29 localhost systemd-rc-local-generator[252991]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:38:29 localhost systemd-sysv-generator[252996]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:38:29 localhost podman[252947]: 2025-11-26 09:38:29.130678369 +0000 UTC m=+0.153352142 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:29 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:38:30 localhost python3.9[253115]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:30 localhost sshd[253116]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:38:30 localhost nova_compute[229802]: 2025-11-26 09:38:30.290 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43936 DF PROTO=TCP SPT=34756 DPT=9102 SEQ=1648210641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C475FC0000000001030307) Nov 26 04:38:31 localhost python3.9[253174]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:32 localhost python3.9[253284]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:32 localhost nova_compute[229802]: 2025-11-26 09:38:32.288 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:32 localhost python3.9[253341]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:34 localhost python3.9[253451]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:38:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:38:34 localhost systemd[1]: Reloading. Nov 26 04:38:34 localhost systemd-rc-local-generator[253493]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:38:34 localhost systemd-sysv-generator[253497]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:38:34 localhost podman[253453]: 2025-11-26 09:38:34.7189604 +0000 UTC m=+0.125166336 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 
26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:34 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:38:34 localhost podman[253453]: 2025-11-26 09:38:34.813318847 +0000 UTC m=+0.219524773 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:38:34 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:38:35 localhost nova_compute[229802]: 2025-11-26 09:38:35.294 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:36 localhost systemd[1]: Starting Create netns directory... Nov 26 04:38:36 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 04:38:36 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 04:38:36 localhost systemd[1]: Finished Create netns directory. 
Nov 26 04:38:37 localhost python3.9[253628]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:38:37 localhost nova_compute[229802]: 2025-11-26 09:38:37.318 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:37 localhost python3.9[253738]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:38:38 localhost podman[253827]: 2025-11-26 09:38:38.38373543 +0000 UTC m=+0.096159994 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9) Nov 26 04:38:38 localhost podman[253827]: 2025-11-26 09:38:38.398376366 +0000 UTC m=+0.110800900 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 26 04:38:38 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:38:38 localhost python3.9[253826]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149917.3536875-735-54592244768937/.source.json _original_basename=.akklh7ud follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43937 DF PROTO=TCP SPT=34756 DPT=9102 SEQ=1648210641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C495FD0000000001030307) Nov 26 04:38:39 localhost python3.9[253955]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:40 localhost nova_compute[229802]: 2025-11-26 09:38:40.331 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:38:41 localhost systemd[1]: tmp-crun.9lh7S3.mount: Deactivated successfully. 
Nov 26 04:38:41 localhost podman[254264]: 2025-11-26 09:38:41.809525112 +0000 UTC m=+0.099014192 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:38:41 localhost podman[254264]: 2025-11-26 09:38:41.817169861 +0000 UTC m=+0.106658961 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:38:41 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:38:41 localhost python3.9[254263]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False Nov 26 04:38:42 localhost nova_compute[229802]: 2025-11-26 09:38:42.347 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:42 localhost python3.9[254396]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:38:43 localhost python3.9[254506]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 26 04:38:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:38:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:38:44 localhost systemd[1]: tmp-crun.P2DX35.mount: Deactivated successfully. 
Nov 26 04:38:44 localhost podman[254550]: 2025-11-26 09:38:44.836567106 +0000 UTC m=+0.094227624 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Nov 26 04:38:44 localhost podman[254550]: 2025-11-26 09:38:44.873376852 +0000 UTC 
m=+0.131037310 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Nov 26 04:38:44 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:38:44 localhost podman[254551]: 2025-11-26 09:38:44.927494076 +0000 UTC m=+0.184319728 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 04:38:44 localhost podman[254551]: 2025-11-26 09:38:44.970279147 +0000 UTC m=+0.227104749 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 04:38:44 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:38:45 localhost nova_compute[229802]: 2025-11-26 09:38:45.371 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:45 localhost openstack_network_exporter[242153]: ERROR 09:38:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:38:45 localhost openstack_network_exporter[242153]: ERROR 09:38:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:38:45 localhost openstack_network_exporter[242153]: ERROR 09:38:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:38:45 localhost openstack_network_exporter[242153]: ERROR 09:38:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:38:45 localhost openstack_network_exporter[242153]: Nov 26 04:38:45 localhost openstack_network_exporter[242153]: ERROR 09:38:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:38:45 localhost openstack_network_exporter[242153]: Nov 26 04:38:47 localhost nova_compute[229802]: 2025-11-26 09:38:47.397 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:48 localhost python3[254677]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:38:48 localhost podman[254714]: Nov 26 04:38:48 localhost podman[254714]: 2025-11-26 09:38:48.861773533 +0000 UTC m=+0.071644011 container create 3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c 
(image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:38:48 localhost podman[254714]: 2025-11-26 09:38:48.823167401 +0000 UTC m=+0.033037929 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Nov 26 04:38:48 localhost python3[254677]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Nov 26 04:38:49 localhost python3.9[254860]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:38:50 localhost nova_compute[229802]: 2025-11-26 09:38:50.412 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:50 localhost python3.9[254972]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:51 localhost python3.9[255027]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:38:51 localhost python3.9[255136]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764149931.0766282-999-62322116377863/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:38:52 localhost python3.9[255191]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:38:52 localhost systemd[1]: Reloading. Nov 26 04:38:52 localhost nova_compute[229802]: 2025-11-26 09:38:52.449 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:38:52 localhost systemd-rc-local-generator[255215]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:38:52 localhost systemd-sysv-generator[255220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost python3.9[255281]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 26 04:38:53 localhost systemd[1]: Reloading.
Nov 26 04:38:53 localhost systemd-rc-local-generator[255306]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:38:53 localhost systemd-sysv-generator[255312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:38:53 localhost systemd[1]: Starting neutron_sriov_agent container...
Nov 26 04:38:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17649 DF PROTO=TCP SPT=41328 DPT=9102 SEQ=1904253311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4CF6A0000000001030307)
Nov 26 04:38:53 localhost systemd[1]: Started libcrun container.
Nov 26 04:38:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/677050064fe3b8e94f7194862f075887c714b23ab98b55e6ce963677f1b36894/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 26 04:38:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/677050064fe3b8e94f7194862f075887c714b23ab98b55e6ce963677f1b36894/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 04:38:53 localhost podman[255321]: 2025-11-26 09:38:53.871116396 +0000 UTC m=+0.123427902 container init 3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:38:53 localhost systemd[1]: tmp-crun.la3e0s.mount: Deactivated successfully.
Nov 26 04:38:53 localhost podman[255321]: 2025-11-26 09:38:53.884634707 +0000 UTC m=+0.136946213 container start 3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 26 04:38:53 localhost podman[255321]: neutron_sriov_agent
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + sudo -E kolla_set_configs
Nov 26 04:38:53 localhost systemd[1]: Started neutron_sriov_agent container.
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Validating config file
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Copying service configuration files
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Writing out command to execute
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/3633976c-3aa0-4c4a-aa49-e8224cd25e39.pid.haproxy
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/3633976c-3aa0-4c4a-aa49-e8224cd25e39.conf
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: ++ cat /run_command
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + CMD=/usr/bin/neutron-sriov-nic-agent
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + ARGS=
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + sudo kolla_copy_cacerts
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + [[ ! -n '' ]]
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + . kolla_extend_start
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + umask 0022
Nov 26 04:38:53 localhost neutron_sriov_agent[255335]: + exec /usr/bin/neutron-sriov-nic-agent
Nov 26 04:38:54 localhost python3.9[255458]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 26 04:38:54 localhost systemd[1]: Stopping neutron_sriov_agent container...
Nov 26 04:38:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17650 DF PROTO=TCP SPT=41328 DPT=9102 SEQ=1904253311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4D37C0000000001030307)
Nov 26 04:38:54 localhost systemd[1]: tmp-crun.6gGy62.mount: Deactivated successfully.
Nov 26 04:38:54 localhost systemd[1]: libpod-3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c.scope: Deactivated successfully.
Nov 26 04:38:54 localhost systemd[1]: libpod-3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c.scope: Consumed 1.003s CPU time.
Nov 26 04:38:54 localhost podman[255463]: 2025-11-26 09:38:54.909216482 +0000 UTC m=+0.086498723 container died 3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=neutron_sriov_agent, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.schema-version=1.0)
Nov 26 04:38:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c-userdata-shm.mount: Deactivated successfully.
Nov 26 04:38:54 localhost podman[255463]: 2025-11-26 09:38:54.952076516 +0000 UTC m=+0.129358758 container cleanup 3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=neutron_sriov_agent, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent)
Nov 26 04:38:54 localhost podman[255463]: neutron_sriov_agent
Nov 26 04:38:55 localhost podman[255489]: 2025-11-26 09:38:55.038119044 +0000 UTC m=+0.057308375 container cleanup 3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.build-date=20251118)
Nov 26 04:38:55 localhost podman[255489]: neutron_sriov_agent
Nov 26 04:38:55 localhost systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Nov 26 04:38:55 localhost systemd[1]: Stopped neutron_sriov_agent container.
Nov 26 04:38:55 localhost systemd[1]: Starting neutron_sriov_agent container...
Nov 26 04:38:55 localhost systemd[1]: Started libcrun container.
Nov 26 04:38:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/677050064fe3b8e94f7194862f075887c714b23ab98b55e6ce963677f1b36894/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 26 04:38:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/677050064fe3b8e94f7194862f075887c714b23ab98b55e6ce963677f1b36894/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 04:38:55 localhost podman[255501]: 2025-11-26 09:38:55.176368617 +0000 UTC m=+0.108609162 container init 3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:38:55 localhost podman[255501]: 2025-11-26 09:38:55.183524009 +0000 UTC m=+0.115764544 container start 3680ab12a0b6f3a6b116215a2914df2325f1b5b04257bc9ff5753c5a13c1952c (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0278b6e848cdc9f630c584054bec526b7b0eea103b9a252922d9477799941c6f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 26 04:38:55 localhost podman[255501]: neutron_sriov_agent
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + sudo -E kolla_set_configs
Nov 26 04:38:55 localhost systemd[1]: Started neutron_sriov_agent container.
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Validating config file
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Copying service configuration files
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Writing out command to execute
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/3633976c-3aa0-4c4a-aa49-e8224cd25e39.pid.haproxy
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/3633976c-3aa0-4c4a-aa49-e8224cd25e39.conf
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: ++ cat /run_command
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + CMD=/usr/bin/neutron-sriov-nic-agent
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + ARGS=
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + sudo kolla_copy_cacerts
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + [[ ! -n '' ]]
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + . kolla_extend_start
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + umask 0022
Nov 26 04:38:55 localhost neutron_sriov_agent[255515]: + exec /usr/bin/neutron-sriov-nic-agent
Nov 26 04:38:55 localhost nova_compute[229802]: 2025-11-26 09:38:55.460 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:38:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43938 DF PROTO=TCP SPT=34756 DPT=9102 SEQ=1648210641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4D5FD0000000001030307)
Nov 26 04:38:55 localhost systemd[1]: session-58.scope: Deactivated successfully.
Nov 26 04:38:55 localhost systemd[1]: session-58.scope: Consumed 24.408s CPU time.
Nov 26 04:38:55 localhost systemd-logind[761]: Session 58 logged out. Waiting for processes to exit.
Nov 26 04:38:55 localhost systemd-logind[761]: Removed session 58.
Nov 26 04:38:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17651 DF PROTO=TCP SPT=41328 DPT=9102 SEQ=1904253311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4DB7C0000000001030307)
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.934 2 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.934 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.935 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.935 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.935 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.935 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.935 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005536118.localdomain'}#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.936 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-c0fbb7e2-309c-46b0-9974-7879398737ce - - - - - -] RPC agent_id: nic-switch-agent.np0005536118.localdomain#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.940 2 INFO neutron.agent.agent_extensions_manager [None req-c0fbb7e2-309c-46b0-9974-7879398737ce - - - - - -] Loaded agent extensions: ['qos']#033[00m
Nov 26 04:38:56 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:56.941 2 INFO neutron.agent.agent_extensions_manager [None req-c0fbb7e2-309c-46b0-9974-7879398737ce - - - - - -] Initializing agent extension 'qos'#033[00m
Nov 26 04:38:57 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:57.340 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-c0fbb7e2-309c-46b0-9974-7879398737ce - - - - - -] Agent initialized successfully, now running... #033[00m
Nov 26 04:38:57 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:57.340 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-c0fbb7e2-309c-46b0-9974-7879398737ce - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m
Nov 26 04:38:57 localhost neutron_sriov_agent[255515]: 2025-11-26 09:38:57.341 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-c0fbb7e2-309c-46b0-9974-7879398737ce - - - - - -] Agent out of sync with plugin!#033[00m
Nov 26 04:38:57 localhost nova_compute[229802]: 2025-11-26 09:38:57.452 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:38:57 localhost podman[240049]: time="2025-11-26T09:38:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 26 04:38:57 localhost podman[240049]: @ - - [26/Nov/2025:09:38:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146310 "" "Go-http-client/1.1"
Nov 26 04:38:57 localhost podman[240049]: @ - - [26/Nov/2025:09:38:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16782 "" "Go-http-client/1.1"
Nov 26 04:38:57 localhost nova_compute[229802]: 2025-11-26 09:38:57.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:38:57 localhost nova_compute[229802]: 2025-11-26 09:38:57.610 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 26 04:38:57 localhost nova_compute[229802]: 2025-11-26 09:38:57.610 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 26 04:38:57 localhost nova_compute[229802]: 2025-11-26 09:38:57.695 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 26 04:38:57 localhost nova_compute[229802]: 2025-11-26 09:38:57.696 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 26 04:38:57 localhost nova_compute[229802]: 2025-11-26 09:38:57.696 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 26 04:38:57 localhost nova_compute[229802]: 2025-11-26 09:38:57.697 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 26 04:38:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42010 DF PROTO=TCP SPT=48152 DPT=9102 SEQ=3066544008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4DFFD0000000001030307)
Nov 26 04:38:58 localhost nova_compute[229802]: 2025-11-26 09:38:58.368 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 26 04:38:58 localhost nova_compute[229802]: 2025-11-26 09:38:58.385 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 26 04:38:58 localhost nova_compute[229802]: 2025-11-26 09:38:58.386 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 26 04:38:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:38:58 localhost podman[255548]: 2025-11-26 09:38:58.81298368 +0000 UTC m=+0.073820879 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible)
Nov 26 04:38:58 localhost podman[255548]: 2025-11-26 09:38:58.829010529 +0000 UTC m=+0.089847738 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro',
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 04:38:58 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:38:59 localhost nova_compute[229802]: 2025-11-26 09:38:59.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:59 localhost nova_compute[229802]: 2025-11-26 09:38:59.610 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:38:59 localhost nova_compute[229802]: 2025-11-26 09:38:59.634 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:38:59 localhost nova_compute[229802]: 2025-11-26 09:38:59.635 229806 DEBUG 
oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:38:59 localhost nova_compute[229802]: 2025-11-26 09:38:59.635 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:38:59 localhost nova_compute[229802]: 2025-11-26 09:38:59.635 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:38:59 localhost nova_compute[229802]: 2025-11-26 09:38:59.636 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:38:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:38:59 localhost systemd[1]: tmp-crun.jSJodl.mount: Deactivated successfully. 
Nov 26 04:38:59 localhost podman[255566]: 2025-11-26 09:38:59.821525567 +0000 UTC m=+0.084220022 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:38:59 localhost podman[255566]: 2025-11-26 09:38:59.835329436 +0000 UTC m=+0.098023871 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:38:59 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.117 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.178 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.178 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.399 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.402 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12199MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.402 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.403 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.494 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.494 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.495 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.499 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:00 localhost nova_compute[229802]: 2025-11-26 09:39:00.545 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:39:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17652 DF PROTO=TCP SPT=41328 DPT=9102 SEQ=1904253311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4EB3C0000000001030307) Nov 26 04:39:00 localhost sshd[255631]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:39:01 localhost nova_compute[229802]: 2025-11-26 09:39:01.012 229806 DEBUG oslo_concurrency.processutils [None 
req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:39:01 localhost nova_compute[229802]: 2025-11-26 09:39:01.021 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:39:01 localhost nova_compute[229802]: 2025-11-26 09:39:01.042 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:39:01 localhost nova_compute[229802]: 2025-11-26 09:39:01.045 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:39:01 localhost nova_compute[229802]: 2025-11-26 09:39:01.045 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.642s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:39:01 localhost systemd-logind[761]: New session 59 of user zuul. Nov 26 04:39:01 localhost systemd[1]: Started Session 59 of User zuul. Nov 26 04:39:02 localhost nova_compute[229802]: 2025-11-26 09:39:02.045 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:39:02 localhost nova_compute[229802]: 2025-11-26 09:39:02.048 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:39:02 localhost nova_compute[229802]: 2025-11-26 09:39:02.048 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:39:02 localhost python3.9[255744]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:39:02 localhost nova_compute[229802]: 2025-11-26 09:39:02.493 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:02 localhost nova_compute[229802]: 2025-11-26 09:39:02.606 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:39:02 localhost nova_compute[229802]: 2025-11-26 09:39:02.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:39:03 localhost python3.9[255925]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:39:03 localhost nova_compute[229802]: 2025-11-26 09:39:03.604 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:39:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:39:03.640 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 
04:39:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:39:03.640 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:39:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:39:03.642 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:39:04 localhost python3.9[255988]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:39:04 localhost nova_compute[229802]: 2025-11-26 09:39:04.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:39:04 localhost nova_compute[229802]: 2025-11-26 09:39:04.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:39:05 localhost nova_compute[229802]: 2025-11-26 
09:39:05.518 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:39:05 localhost podman[256009]: 2025-11-26 09:39:05.831634114 +0000 UTC m=+0.092676725 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:39:05 localhost podman[256009]: 2025-11-26 09:39:05.879168074 +0000 UTC m=+0.140210665 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:39:05 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:39:07 localhost nova_compute[229802]: 2025-11-26 09:39:07.512 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:39:08 localhost podman[256051]: 2025-11-26 09:39:08.825564968 +0000 UTC m=+0.088271149 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.buildah.version=1.33.7) Nov 26 04:39:08 localhost podman[256051]: 2025-11-26 09:39:08.840087849 +0000 UTC m=+0.102794040 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal) Nov 26 04:39:08 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:39:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17653 DF PROTO=TCP SPT=41328 DPT=9102 SEQ=1904253311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C50BFC0000000001030307) Nov 26 04:39:09 localhost python3.9[256163]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 26 04:39:09 localhost sshd[256165]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:39:10 localhost nova_compute[229802]: 2025-11-26 09:39:10.550 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:11 localhost python3.9[256278]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:39:12 localhost podman[256389]: 2025-11-26 09:39:12.322108042 +0000 UTC m=+0.081508318 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:39:12 localhost podman[256389]: 2025-11-26 09:39:12.331537835 +0000 UTC m=+0.090938111 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:39:12 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:39:12 localhost python3.9[256388]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:12 localhost nova_compute[229802]: 2025-11-26 09:39:12.543 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:13 localhost python3.9[256521]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:13 localhost python3.9[256631]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:14 localhost python3.9[256741]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:39:15 localhost podman[256851]: 2025-11-26 09:39:15.008592086 +0000 UTC m=+0.080242258 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:39:15 localhost podman[256851]: 2025-11-26 09:39:15.018629709 +0000 UTC m=+0.090279921 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true) Nov 26 04:39:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:39:15 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:39:15 localhost systemd[1]: tmp-crun.ta0aEp.mount: Deactivated successfully. Nov 26 04:39:15 localhost python3.9[256852]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:15 localhost podman[256870]: 2025-11-26 09:39:15.1250087 +0000 UTC m=+0.091616513 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd) Nov 26 04:39:15 localhost podman[256870]: 2025-11-26 09:39:15.165316234 +0000 UTC m=+0.131924067 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:39:15 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:39:15 localhost nova_compute[229802]: 2025-11-26 09:39:15.586 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:15 localhost openstack_network_exporter[242153]: ERROR 09:39:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:39:15 localhost openstack_network_exporter[242153]: ERROR 09:39:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:39:15 localhost openstack_network_exporter[242153]: ERROR 09:39:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:39:15 localhost openstack_network_exporter[242153]: ERROR 09:39:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:39:15 localhost openstack_network_exporter[242153]: Nov 26 04:39:15 localhost openstack_network_exporter[242153]: ERROR 09:39:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:39:15 localhost openstack_network_exporter[242153]: Nov 26 04:39:15 localhost 
python3.9[256998]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:16 localhost python3.9[257108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:17 localhost nova_compute[229802]: 2025-11-26 09:39:17.574 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:17 localhost python3.9[257196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149956.1732311-279-23995897689209/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:18 localhost python3.9[257304]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:19 localhost python3.9[257390]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf 
mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149957.8463511-324-37677529014548/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:20 localhost python3.9[257498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:20 localhost nova_compute[229802]: 2025-11-26 09:39:20.608 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:21 localhost python3.9[257584]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149959.7960315-324-93462069168423/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:21 localhost python3.9[257692]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:22 localhost python3.9[257778]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149961.4692981-324-69411474279042/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=1d7ed71a4a41b21f5b087feb1b6dafb54abe0d30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:22 localhost nova_compute[229802]: 2025-11-26 09:39:22.577 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42199 DF PROTO=TCP SPT=46918 DPT=9102 SEQ=2799441351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C544980000000001030307) Nov 26 04:39:23 localhost python3.9[257886]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:24 localhost python3.9[257972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149963.3245533-498-88582388213532/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=fce2fc14e9ea703c92eb8589d29e752536516b0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None 
attributes=None Nov 26 04:39:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42200 DF PROTO=TCP SPT=46918 DPT=9102 SEQ=2799441351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C548BC0000000001030307) Nov 26 04:39:25 localhost python3.9[258080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:25 localhost nova_compute[229802]: 2025-11-26 09:39:25.608 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17654 DF PROTO=TCP SPT=41328 DPT=9102 SEQ=1904253311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C54BFC0000000001030307) Nov 26 04:39:25 localhost python3.9[258166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149964.611721-543-164268207253968/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:26 localhost python3.9[258274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 
26 04:39:26 localhost python3.9[258360]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149965.8685877-543-53464073276438/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42201 DF PROTO=TCP SPT=46918 DPT=9102 SEQ=2799441351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C550BC0000000001030307) Nov 26 04:39:27 localhost podman[240049]: time="2025-11-26T09:39:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:39:27 localhost podman[240049]: @ - - [26/Nov/2025:09:39:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146310 "" "Go-http-client/1.1" Nov 26 04:39:27 localhost podman[240049]: @ - - [26/Nov/2025:09:39:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16787 "" "Go-http-client/1.1" Nov 26 04:39:27 localhost python3.9[258468]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:27 localhost nova_compute[229802]: 2025-11-26 09:39:27.581 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:27 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43939 DF PROTO=TCP SPT=34756 DPT=9102 SEQ=1648210641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C553FC0000000001030307) Nov 26 04:39:28 localhost python3.9[258523]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:28 localhost python3.9[258631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:29 localhost python3.9[258717]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764149968.170177-630-55453748130375/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:39:29 localhost podman[258789]: 2025-11-26 09:39:29.824888218 +0000 UTC m=+0.084511930 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 26 04:39:29 localhost podman[258789]: 2025-11-26 09:39:29.842348662 +0000 UTC m=+0.101972364 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible) Nov 26 04:39:29 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:39:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 04:39:29 localhost podman[258845]: 2025-11-26 09:39:29.979987665 +0000 UTC m=+0.090707754 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:39:29 localhost podman[258845]: 2025-11-26 09:39:29.991489883 +0000 UTC m=+0.102209972 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:39:30 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:39:30 localhost python3.9[258844]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:39:30 localhost nova_compute[229802]: 2025-11-26 09:39:30.649 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:30 localhost python3.9[258979]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42202 DF PROTO=TCP SPT=46918 DPT=9102 SEQ=2799441351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5607C0000000001030307) Nov 26 04:39:31 localhost python3.9[259089]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:31 localhost python3.9[259146]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t 
dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:32 localhost python3.9[259256]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:32 localhost nova_compute[229802]: 2025-11-26 09:39:32.635 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:33 localhost python3.9[259313]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:33 localhost python3.9[259423]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:39:34 localhost python3.9[259533]: ansible-ansible.legacy.stat Invoked with 
path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:34 localhost python3.9[259590]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:39:35 localhost python3.9[259700]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:35 localhost nova_compute[229802]: 2025-11-26 09:39:35.686 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:39:36 localhost systemd[1]: tmp-crun.SfPdhw.mount: Deactivated successfully. 
Nov 26 04:39:36 localhost podman[259758]: 2025-11-26 09:39:36.065654095 +0000 UTC m=+0.097230807 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:39:36 localhost podman[259758]: 2025-11-26 09:39:36.110989655 +0000 UTC m=+0.142566357 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true) Nov 26 04:39:36 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:39:36 localhost python3.9[259757]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:39:36 localhost python3.9[259893]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:39:36 localhost systemd[1]: Reloading. 
Nov 26 04:39:37 localhost systemd-rc-local-generator[259918]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:39:37 localhost systemd-sysv-generator[259923]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:37 localhost nova_compute[229802]: 2025-11-26 09:39:37.674 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42203 DF PROTO=TCP SPT=46918 DPT=9102 SEQ=2799441351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C57FFD0000000001030307) Nov 26 04:39:38 localhost python3.9[260041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:39:39 localhost podman[260060]: 2025-11-26 09:39:39.838277871 +0000 UTC m=+0.089599150 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64) Nov 26 04:39:39 localhost podman[260060]: 2025-11-26 09:39:39.880429902 +0000 UTC m=+0.131751161 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible) 
Nov 26 04:39:39 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:39:40 localhost python3.9[260118]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:39:40 localhost nova_compute[229802]: 2025-11-26 09:39:40.715 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:41 localhost python3.9[260228]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:41 localhost python3.9[260285]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:39:42 localhost python3.9[260395]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:39:42 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:39:42 localhost systemd[1]: Reloading. Nov 26 04:39:42 localhost podman[260397]: 2025-11-26 09:39:42.48308263 +0000 UTC m=+0.093164972 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:39:42 localhost systemd-sysv-generator[260444]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:39:42 localhost systemd-rc-local-generator[260440]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:39:42 localhost podman[260397]: 2025-11-26 09:39:42.521491464 +0000 UTC m=+0.131573786 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse 
service type, ignoring: notify-reload Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:42 localhost nova_compute[229802]: 2025-11-26 09:39:42.718 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:42 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:39:42 localhost systemd[1]: Starting Create netns directory... Nov 26 04:39:42 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 04:39:42 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 04:39:42 localhost systemd[1]: Finished Create netns directory. 
Nov 26 04:39:43 localhost python3.9[260569]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:39:44 localhost python3.9[260679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:39:45 localhost python3.9[260767]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764149984.0462756-1074-102104594889826/.source.json _original_basename=.y5iibyf2 follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:39:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:39:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:39:45 localhost nova_compute[229802]: 2025-11-26 09:39:45.748 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:45 localhost openstack_network_exporter[242153]: ERROR 09:39:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:39:45 localhost openstack_network_exporter[242153]: ERROR 09:39:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:39:45 localhost openstack_network_exporter[242153]: ERROR 09:39:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:39:45 localhost openstack_network_exporter[242153]: ERROR 09:39:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:39:45 localhost openstack_network_exporter[242153]: Nov 26 04:39:45 localhost openstack_network_exporter[242153]: ERROR 09:39:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:39:45 localhost openstack_network_exporter[242153]: Nov 26 04:39:45 localhost podman[260878]: 2025-11-26 09:39:45.829966476 +0000 UTC m=+0.166246064 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent) Nov 26 04:39:45 localhost podman[260879]: 2025-11-26 09:39:45.798882639 +0000 UTC m=+0.135246870 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd) Nov 26 04:39:45 localhost podman[260878]: 2025-11-26 09:39:45.860507577 +0000 UTC m=+0.196787195 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:39:45 localhost python3.9[260877]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:39:45 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:39:45 localhost podman[260879]: 2025-11-26 09:39:45.878442285 +0000 UTC m=+0.214806516 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:39:45 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:39:47 localhost nova_compute[229802]: 2025-11-26 09:39:47.756 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:48 localhost python3.9[261220]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False Nov 26 04:39:49 localhost python3.9[261330]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:39:49 localhost sshd[261331]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:39:50 localhost python3.9[261442]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 26 04:39:50 localhost nova_compute[229802]: 2025-11-26 09:39:50.787 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:52 localhost nova_compute[229802]: 2025-11-26 09:39:52.795 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42649 DF PROTO=TCP SPT=48388 DPT=9102 SEQ=64747573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5B9C90000000001030307) Nov 26 04:39:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42650 DF PROTO=TCP SPT=48388 DPT=9102 SEQ=64747573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5BDBC0000000001030307) Nov 26 04:39:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42204 DF PROTO=TCP SPT=46918 DPT=9102 SEQ=2799441351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5BFFC0000000001030307) Nov 26 04:39:55 localhost python3[261580]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:39:55 localhost podman[261621]: Nov 26 04:39:55 localhost podman[261621]: 2025-11-26 09:39:55.717482592 +0000 UTC m=+0.093300325 container create b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, 
config_id=neutron_dhcp, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 04:39:55 localhost podman[261621]: 2025-11-26 09:39:55.65637434 +0000 UTC m=+0.032192093 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:39:55 localhost python3[261580]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume 
/run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:39:55 localhost nova_compute[229802]: 2025-11-26 09:39:55.828 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:56 localhost python3.9[261766]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:39:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42651 DF PROTO=TCP SPT=48388 DPT=9102 SEQ=64747573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5C5BD0000000001030307) Nov 26 04:39:57 localhost python3.9[261878]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 
26 04:39:57 localhost podman[240049]: time="2025-11-26T09:39:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:39:57 localhost podman[240049]: @ - - [26/Nov/2025:09:39:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148614 "" "Go-http-client/1.1" Nov 26 04:39:57 localhost podman[240049]: @ - - [26/Nov/2025:09:39:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17101 "" "Go-http-client/1.1" Nov 26 04:39:57 localhost python3.9[261933]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:39:57 localhost nova_compute[229802]: 2025-11-26 09:39:57.798 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:39:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17655 DF PROTO=TCP SPT=41328 DPT=9102 SEQ=1904253311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5C9FD0000000001030307) Nov 26 04:39:58 localhost sshd[262042]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:39:58 localhost python3.9[262044]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764149997.8422453-1338-50807549145591/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:39:59 localhost python3.9[262099]: 
ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:39:59 localhost systemd[1]: Reloading. Nov 26 04:39:59 localhost systemd-rc-local-generator[262123]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:39:59 localhost systemd-sysv-generator[262126]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:39:59 localhost nova_compute[229802]: 2025-11-26 09:39:59.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:39:59 localhost nova_compute[229802]: 2025-11-26 09:39:59.609 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:39:59 localhost nova_compute[229802]: 2025-11-26 09:39:59.610 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:40:00 localhost python3.9[262189]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:40:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 04:40:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:40:00 localhost systemd[1]: Reloading. Nov 26 04:40:00 localhost podman[262191]: 2025-11-26 09:40:00.212658194 +0000 UTC m=+0.104255046 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:40:00 localhost systemd-rc-local-generator[262256]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 04:40:00 localhost podman[262191]: 2025-11-26 09:40:00.277974016 +0000 UTC m=+0.169570878 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:40:00 localhost systemd-sysv-generator[262260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:40:00 localhost podman[262192]: 2025-11-26 09:40:00.286990524 +0000 UTC m=+0.178310648 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute) Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:40:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:40:00 localhost podman[262192]: 2025-11-26 09:40:00.373524922 +0000 UTC m=+0.264845056 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118) Nov 26 04:40:00 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:40:00 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:40:00 localhost systemd[1]: Starting neutron_dhcp_agent container... 
Nov 26 04:40:00 localhost nova_compute[229802]: 2025-11-26 09:40:00.660 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:40:00 localhost nova_compute[229802]: 2025-11-26 09:40:00.661 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:40:00 localhost nova_compute[229802]: 2025-11-26 09:40:00.662 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:40:00 localhost nova_compute[229802]: 2025-11-26 09:40:00.663 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:40:00 localhost systemd[1]: Started libcrun container. 
Nov 26 04:40:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d11008d9e63936d94c7b09f83d6a30183d12c9c55a32c5cfb17ad1f2736ed00/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Nov 26 04:40:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d11008d9e63936d94c7b09f83d6a30183d12c9c55a32c5cfb17ad1f2736ed00/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:40:00 localhost podman[262271]: 2025-11-26 09:40:00.694372989 +0000 UTC m=+0.148186497 container init b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 04:40:00 localhost podman[262271]: 2025-11-26 09:40:00.703220612 +0000 UTC m=+0.157034120 container start b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, container_name=neutron_dhcp_agent) Nov 26 04:40:00 localhost podman[262271]: neutron_dhcp_agent Nov 26 04:40:00 localhost 
neutron_dhcp_agent[262285]: + sudo -E kolla_set_configs Nov 26 04:40:00 localhost systemd[1]: Started neutron_dhcp_agent container. Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Validating config file Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Copying service configuration files Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Writing out command to execute Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/external Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for 
/var/lib/neutron/metadata_proxy Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c8b37597a29e61f341a0e3f5416437aac1a5cd21cb3a407dd674c7a7a1ff41da Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/3633976c-3aa0-4c4a-aa49-e8224cd25e39.pid.haproxy Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/3633976c-3aa0-4c4a-aa49-e8224cd25e39.conf Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: ++ cat /run_command Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: + CMD=/usr/bin/neutron-dhcp-agent Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: + ARGS= Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: + sudo kolla_copy_cacerts Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: + [[ ! -n '' ]] Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: + . 
kolla_extend_start Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: Running command: '/usr/bin/neutron-dhcp-agent' Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: + umask 0022 Nov 26 04:40:00 localhost neutron_dhcp_agent[262285]: + exec /usr/bin/neutron-dhcp-agent Nov 26 04:40:00 localhost nova_compute[229802]: 2025-11-26 09:40:00.853 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42652 DF PROTO=TCP SPT=48388 DPT=9102 SEQ=64747573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5D57C0000000001030307) Nov 26 04:40:01 localhost python3.9[262409]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:40:01 localhost systemd[1]: Stopping neutron_dhcp_agent container... Nov 26 04:40:01 localhost systemd[1]: libpod-b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9.scope: Deactivated successfully. Nov 26 04:40:01 localhost systemd[1]: libpod-b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9.scope: Consumed 1.017s CPU time. 
Nov 26 04:40:01 localhost podman[262413]: 2025-11-26 09:40:01.735008027 +0000 UTC m=+0.084114104 container died b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:40:01 localhost podman[262413]: 2025-11-26 09:40:01.791717471 +0000 UTC m=+0.140823468 container cleanup b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, 
container_name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:40:01 localhost podman[262413]: neutron_dhcp_agent Nov 26 04:40:01 localhost podman[262450]: error opening file `/run/crun/b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9/status`: No such file or directory Nov 26 04:40:01 localhost podman[262438]: 2025-11-26 09:40:01.866550217 +0000 UTC m=+0.047873622 container cleanup b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=neutron_dhcp, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:40:01 localhost podman[262438]: neutron_dhcp_agent Nov 26 04:40:01 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully. Nov 26 04:40:01 localhost systemd[1]: Stopped neutron_dhcp_agent container. Nov 26 04:40:01 localhost systemd[1]: Starting neutron_dhcp_agent container... Nov 26 04:40:02 localhost systemd[1]: Started libcrun container. 
Nov 26 04:40:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d11008d9e63936d94c7b09f83d6a30183d12c9c55a32c5cfb17ad1f2736ed00/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Nov 26 04:40:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d11008d9e63936d94c7b09f83d6a30183d12c9c55a32c5cfb17ad1f2736ed00/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:40:02 localhost podman[262452]: 2025-11-26 09:40:02.020757608 +0000 UTC m=+0.123173012 container init b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:40:02 localhost podman[262452]: 2025-11-26 09:40:02.031713837 +0000 UTC m=+0.134129231 container start b793c53bfa44618b9f0b8989b84a6926ff1880d4925c98310a6c062ef68762f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=neutron_dhcp, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '3ebc0f2603b0886b4fea7beabc45dd872809ddc013a9354ba62cfce274f2df41'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 04:40:02 localhost podman[262452]: 
neutron_dhcp_agent Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + sudo -E kolla_set_configs Nov 26 04:40:02 localhost systemd[1]: Started neutron_dhcp_agent container. Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Validating config file Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Copying service configuration files Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Writing out command to execute Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/external Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: 
INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c8b37597a29e61f341a0e3f5416437aac1a5cd21cb3a407dd674c7a7a1ff41da Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/3633976c-3aa0-4c4a-aa49-e8224cd25e39.pid.haproxy Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/3633976c-3aa0-4c4a-aa49-e8224cd25e39.conf Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: ++ cat /run_command Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + CMD=/usr/bin/neutron-dhcp-agent Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + ARGS= Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + sudo kolla_copy_cacerts Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + [[ ! 
-n '' ]] Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + . kolla_extend_start Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: Running command: '/usr/bin/neutron-dhcp-agent' Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + umask 0022 Nov 26 04:40:02 localhost neutron_dhcp_agent[262467]: + exec /usr/bin/neutron-dhcp-agent Nov 26 04:40:02 localhost nova_compute[229802]: 2025-11-26 09:40:02.835 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:03 localhost systemd[1]: session-59.scope: Deactivated successfully. Nov 26 04:40:03 localhost systemd[1]: session-59.scope: Consumed 36.012s CPU time. Nov 26 04:40:03 localhost systemd-logind[761]: Session 59 logged out. Waiting for processes to exit. Nov 26 04:40:03 localhost systemd-logind[761]: Removed session 59. 
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.560 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.561 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.566 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c74da92c-7978-4020-b57b-3848cabf48fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.561188', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e26f8e02-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 'message_signature': '1deda3220f972bb843fe2d3d2905e148a395e211d0cb0392f5eccc5df62cd14f'}]}, 'timestamp': '2025-11-26 09:40:03.566755', '_unique_id': '1962333b332a4acf9fd1232e13f0019c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.567 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.568 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.568 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3eb990b4-dbb6-4c8f-8bf0-b5f3ae81bdbb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.568820', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e26fee7e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 'message_signature': 'e684f27ffcbf9111ac9500506e2de20438fedcdb5b6088d6dd9ac16fba886392'}]}, 'timestamp': '2025-11-26 09:40:03.569137', '_unique_id': 'd93a8473f65d4b6b82bd84ba37a64f0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.569 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.570 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '167466b4-24ed-44aa-87ea-59b07f523586', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.570281', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e2702646-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 'message_signature': 'cf496f584ed686c32ef715768ba349bd5870970ce193e1d32229075968d138cd'}]}, 'timestamp': '2025-11-26 09:40:03.570594', '_unique_id': '8852d3ac6797470693800652b0e7894a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.571 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a09b11cd-656e-4fe0-8fc9-35577caf24e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.571706', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e2705da0-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 'message_signature': '97056ca6a8e6a580d16bb79174da2dbb7e261a5dcf2fa8723f7136d38ae2e4ed'}]}, 'timestamp': '2025-11-26 09:40:03.572028', '_unique_id': 'bcf808dae3994c6b8770b7b3e2b005ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging   File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.572 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.573 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:40:03 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:40:03.591 262471 INFO neutron.common.config [-] Logging enabled!#033[00m Nov 26 04:40:03 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:40:03.592 262471 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.602 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 52.296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ecb3276c-28ef-434f-b832-9d3f30a593f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:40:03.573364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e27519b2-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.844359808, 'message_signature': 'dfd7022e8def6cbe06fd420f0b90cf58c6fd35f78a968df8990427eba9865d1b'}]}, 'timestamp': '2025-11-26 09:40:03.603142', '_unique_id': '92a054a1a8454e5e86825694467ce618'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.604 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:40:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c8725e1-de97-467d-8be3-e366bc93822f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.605088', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e2757484-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 
'message_signature': '210d0392c27878094ab257d2a528fa015b583294d15d9fed9cd2592da275164f'}]}, 'timestamp': '2025-11-26 09:40:03.605320', '_unique_id': 'a3f24c1ee3614b9ebf2a17e4929afb13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.605 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.606 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.606 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.606 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 60040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '22795e66-4492-4c75-9fbf-7dc72fadbd9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60040000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:40:03.606398', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e275a6e8-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.844359808, 'message_signature': 'dd3916d1ea4b33b6677a0f1a1b5e72ffb7235e041a737dfc5a03097e66a2975f'}]}, 'timestamp': '2025-11-26 09:40:03.606601', '_unique_id': '956290899be241dc8894d701dc51a22d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:40:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.607 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e454b5d-7644-4d96-ac3e-7512348c7c99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.607627', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e275d776-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 
'message_signature': '3066fd516557de2f872c3468e6e28d695cedb8130607da43697ee68c7d5898df'}]}, 'timestamp': '2025-11-26 09:40:03.607852', '_unique_id': 'f38becb0c6524e5a81553912e65f3839'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.608 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.609 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '82802ed5-92cc-4a0b-8d9d-d19ca77dc130', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.609244', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e276165a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 'message_signature': 'd99e0bda34c70d663f5dec121865fffb79ab4682ad057e1a6c2ef27233399620'}]}, 'timestamp': '2025-11-26 09:40:03.609544', '_unique_id': '07534c6cbb67476fa5252a61d9f6af83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.610 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.621 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.621 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '932c2fae-9b5b-4564-94cf-4929bb79c515', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.610669', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e277ea66-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.852918323, 'message_signature': '9ff7f6ca69d099adb76766de5a57e06beef1a5d35c2f5928fb1c13d4333d2e32'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.610669', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e277f330-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.852918323, 'message_signature': '21ef54207fc90a8cd00ff79328290f45d554557e4cdb1d51588e44ae6ac3f2c1'}]}, 'timestamp': '2025-11-26 09:40:03.621694', '_unique_id': 'f930aa9eaca745129a35be528d9f6790'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.623 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:40:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:40:03.641 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:40:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:40:03.641 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:40:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:40:03.643 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.652 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.653 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a738ac3-67dc-4ffa-bf7b-661fd3672876', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.623921', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e27cd210-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': '0737bf7a59eba7e4026bbac1fb3daa1caa8e50d96723715e5af0e9e8ac50a276'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.623921', 'resource_metadata': {'display_name': 
'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e27ce9da-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': 'cd0c892c1f1ffe96fbe01402864ab05b34ee0fd1f774236f6d44819425c6c308'}]}, 'timestamp': '2025-11-26 09:40:03.654320', '_unique_id': 'c08757e1ed7c4618996a15281064f1ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.657 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.658 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '385d3911-6c2b-4937-b5b9-87f8de8f6a71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.657532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e27d7a1c-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.852918323, 'message_signature': '13f1b375cbed6fde40002e525984e1b09077596eaa111d73840739ef0990500e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 
'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.657532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e27d8d22-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.852918323, 'message_signature': '8b8c7a8bc563daeaf0f4af3c6b6084382e2dbf8a88e796b4851c5ff5a427e60b'}]}, 'timestamp': '2025-11-26 09:40:03.658475', '_unique_id': '50a19d77c99a46fa94f58a8f13f35115'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1141678425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.661 12 DEBUG 
ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 173265014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2902c962-2336-4491-af40-5df6bcc1b9b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1141678425, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.660667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e27df3de-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': '00e6242c42cf244a4063f63c3e4453635456777ea2336a097be86531ee9442f7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 
'counter_volume': 173265014, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.660667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e27e0522-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': '18b1564338ac215b1a727bbbf5daa461fefaa196a1863aa04b3beed6a1d0ebc4'}]}, 'timestamp': '2025-11-26 09:40:03.661542', '_unique_id': '51c8d6a1d663472b8a8ae5a36bb255fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:40:03.662 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.663 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 
04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.664 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '987f840c-1956-49c4-bf6c-c574c42b8943', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.663721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e27e6c38-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': 'e3bc65fae9f20592f97e61add915fe739c7fc9ecb2d9e947b024eef6f3cb888d'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.663721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e27e7c78-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': 'e304bf43ae7b487fd27a3a9c9c291415c49c0dd0ec446299ca4450c528006fb2'}]}, 'timestamp': '2025-11-26 09:40:03.664599', '_unique_id': 'db0fa65294e74056b3b356cf3bacb519'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.666 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.666 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.667 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2c0adbc-5977-4952-8aa0-f83454d0b0df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.667064', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e27eeb2c-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 'message_signature': '0a588446b0ef42013e6431682f9ab38f59848cf3f029c5e96d6edb6bedda5886'}]}, 'timestamp': '2025-11-26 09:40:03.667387', '_unique_id': '0db2c32a0ece4bd89b8ac59310673771'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.669 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a4a83289-7e0d-445b-b5fc-1cc8409bc784', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.668791', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e27f300a-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': '7771df2d43dbd733ba08c06aebbbd85f89621bbb7bbed88149b5dacdaf889930'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.668791', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e27f3b22-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': '57a4dcd8e8c13c5a649d46c2b97e267b6ae4dc5a18ce22274fb4710405164186'}]}, 'timestamp': '2025-11-26 09:40:03.669415', '_unique_id': '566814bc685e451c8df7a0c1d3a65d6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.670 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 627516836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 21052656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0b9b14c-882a-4864-b801-cddf0739f779', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 627516836, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.670868', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e27f8028-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': 'cd6e051500db6fea7574126aba0f09dc37767d9be153267b72a85be10a973779'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21052656, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.670868', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e27f8b18-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': '1ddd6ac1d55aceaa4c5a8f34d124f05b07d19f96e94239107316596d2e19980d'}]}, 'timestamp': '2025-11-26 09:40:03.671458', '_unique_id': 'e721226b67374f5aab3c491a005f8e20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to 
notifications. Payload={'message_id': '955e0f5f-841c-4843-bac2-3e539ddc7312', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.672898', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e27fcf88-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 'message_signature': 'c7b2db41a2c9a390b06c6992d66f7e522852431846bfa3dc33abfb21f8ff28b8'}]}, 'timestamp': '2025-11-26 09:40:03.673232', '_unique_id': '0b7bc9b7240e4d9e9c0d904143735470'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.674 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.674 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '068b5dc9-9e3c-4b33-b86d-298d444e6830', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.674984', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e280201e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.852918323, 'message_signature': '4ed21238dfd3257adcce83f3f15631f2b9edfe2f32980afe385146b7a9bcbea7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.674984', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2802ac8-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.852918323, 'message_signature': 'e2f2a788867aa2a15d2f8c39358bb9fc8d20ca7db6b20241e9b7ef32f7c3f59a'}]}, 'timestamp': '2025-11-26 09:40:03.675547', '_unique_id': '348455d5cddb4e1f90921d987a9ec783'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:40:03.676 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:40:03.676 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.677 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.677 
12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd244fda6-66f3-441d-bbf2-d87d5f46fe11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:40:03.677132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e28073de-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': 'a64ac2543ee4a68da6bf6bb3f3ba6961ca6e0cf07e8d6afe9a75c04c592dbeda'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 
'request', 'counter_volume': 222, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:40:03.677132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2807e7e-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.866182033, 'message_signature': 'c2a06921d9269302913dd0d2bec3631521508f84e50852654ebe90cb62b40050'}]}, 'timestamp': '2025-11-26 09:40:03.677688', '_unique_id': '8361ec8482b04c5a9ad9a44f8e4594bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in 
connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR 
oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:40:03.678 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.678 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.679 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.679 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 9035 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9caccfb5-c14f-4ee2-b497-bca9f23734dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9035, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:40:03.679137', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'e280c226-caab-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10478.803429082, 'message_signature': '388e4ab8480a37b6f2b54e5b101db8f3866d3e06d55741fa09d6ea64d2606cb8'}]}, 'timestamp': '2025-11-26 09:40:03.679442', '_unique_id': '1666bb1667fe4145a8def58826a117a2'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:40:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:40:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:40:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.704 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.720 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.720 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.720 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.720 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.721 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.721 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.721 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.748 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.749 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.749 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 09:40:03.749 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:40:03 localhost nova_compute[229802]: 2025-11-26 
09:40:03.750 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:40:04 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:40:04.110 262471 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.207 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.315 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.316 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.579 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.581 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12101MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.581 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.582 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.648 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.649 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.650 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:40:04 localhost nova_compute[229802]: 2025-11-26 09:40:04.689 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:40:04 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:40:04.725 262471 INFO neutron.agent.dhcp.agent [None req-3d9a5d7e-9be2-4dc5-ab29-1757d5022679 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 26 04:40:04 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:40:04.726 262471 INFO neutron.agent.dhcp.agent [None req-3d9a5d7e-9be2-4dc5-ab29-1757d5022679 - - - - - -] Synchronizing state complete#033[00m Nov 26 04:40:04 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:40:04.810 262471 INFO neutron.agent.dhcp.agent [None req-3d9a5d7e-9be2-4dc5-ab29-1757d5022679 - - - - - -] DHCP agent started#033[00m Nov 26 04:40:05 localhost nova_compute[229802]: 2025-11-26 09:40:05.158 229806 DEBUG oslo_concurrency.processutils 
[None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:40:05 localhost nova_compute[229802]: 2025-11-26 09:40:05.168 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:40:05 localhost nova_compute[229802]: 2025-11-26 09:40:05.183 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:40:05 localhost nova_compute[229802]: 2025-11-26 09:40:05.186 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:40:05 localhost nova_compute[229802]: 2025-11-26 09:40:05.187 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.605s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:40:05 localhost nova_compute[229802]: 2025-11-26 09:40:05.920 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:06 localhost nova_compute[229802]: 2025-11-26 09:40:06.075 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:06 localhost nova_compute[229802]: 2025-11-26 09:40:06.076 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:06 localhost nova_compute[229802]: 2025-11-26 09:40:06.077 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:06 localhost nova_compute[229802]: 2025-11-26 09:40:06.077 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:40:06 localhost systemd[1]: tmp-crun.mGFtOB.mount: Deactivated successfully. 
Nov 26 04:40:06 localhost podman[262684]: 2025-11-26 09:40:06.626190183 +0000 UTC m=+0.119700845 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:40:06 localhost podman[262684]: 2025-11-26 09:40:06.701632317 +0000 UTC m=+0.195143039 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:40:06 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:40:07 localhost nova_compute[229802]: 2025-11-26 09:40:07.884 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42653 DF PROTO=TCP SPT=48388 DPT=9102 SEQ=64747573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5F5FC0000000001030307) Nov 26 04:40:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:40:10 localhost podman[262712]: 2025-11-26 09:40:10.833591281 +0000 UTC m=+0.087569881 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, config_id=edpm) Nov 26 04:40:10 localhost podman[262712]: 2025-11-26 09:40:10.846040806 +0000 UTC m=+0.100019356 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, version=9.6) Nov 26 04:40:10 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:40:10 localhost nova_compute[229802]: 2025-11-26 09:40:10.953 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:12 localhost nova_compute[229802]: 2025-11-26 09:40:12.913 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:40:13 localhost podman[262734]: 2025-11-26 09:40:13.825685799 +0000 UTC m=+0.087235320 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:40:13 localhost podman[262734]: 2025-11-26 09:40:13.865511201 +0000 UTC m=+0.127060752 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:40:13 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:40:15 localhost openstack_network_exporter[242153]: ERROR 09:40:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:40:15 localhost openstack_network_exporter[242153]: ERROR 09:40:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:40:15 localhost openstack_network_exporter[242153]: ERROR 09:40:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:40:15 localhost openstack_network_exporter[242153]: ERROR 09:40:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:40:15 localhost openstack_network_exporter[242153]: Nov 26 04:40:15 localhost openstack_network_exporter[242153]: ERROR 09:40:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:40:15 localhost openstack_network_exporter[242153]: Nov 26 04:40:15 localhost nova_compute[229802]: 2025-11-26 09:40:15.997 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:40:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:40:16 localhost podman[262756]: 2025-11-26 09:40:16.838156787 +0000 UTC m=+0.097505348 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 26 04:40:16 localhost podman[262756]: 2025-11-26 09:40:16.874539193 +0000 UTC 
m=+0.133887744 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 26 04:40:16 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:40:16 localhost podman[262757]: 2025-11-26 09:40:16.889661841 +0000 UTC m=+0.145474042 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118) Nov 26 04:40:16 localhost podman[262757]: 2025-11-26 09:40:16.929312128 +0000 UTC m=+0.185124369 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 26 04:40:16 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:40:17 localhost nova_compute[229802]: 2025-11-26 09:40:17.955 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:21 localhost nova_compute[229802]: 2025-11-26 09:40:21.042 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:22 localhost nova_compute[229802]: 2025-11-26 09:40:22.996 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7816 DF PROTO=TCP SPT=59240 DPT=9102 SEQ=781678455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C62EF90000000001030307) Nov 26 04:40:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7817 DF PROTO=TCP SPT=59240 DPT=9102 SEQ=781678455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C632FD0000000001030307) Nov 26 04:40:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42654 DF PROTO=TCP SPT=48388 DPT=9102 SEQ=64747573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C635FC0000000001030307) Nov 26 04:40:26 localhost nova_compute[229802]: 2025-11-26 09:40:26.047 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7818 DF PROTO=TCP SPT=59240 DPT=9102 SEQ=781678455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C63AFC0000000001030307) Nov 26 04:40:27 localhost podman[240049]: time="2025-11-26T09:40:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:40:27 localhost podman[240049]: @ - - [26/Nov/2025:09:40:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148615 "" "Go-http-client/1.1" Nov 26 04:40:27 localhost podman[240049]: @ - - [26/Nov/2025:09:40:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17224 "" "Go-http-client/1.1" Nov 26 04:40:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42205 DF PROTO=TCP SPT=46918 DPT=9102 SEQ=2799441351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C63DFD0000000001030307) Nov 26 04:40:28 localhost nova_compute[229802]: 2025-11-26 09:40:28.000 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:29 localhost sshd[262793]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:40:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:40:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:40:30 localhost podman[262796]: 2025-11-26 09:40:30.829416707 +0000 UTC m=+0.087294281 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:40:30 localhost podman[262796]: 2025-11-26 09:40:30.837007813 +0000 UTC m=+0.094885437 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 04:40:30 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:40:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7819 DF PROTO=TCP SPT=59240 DPT=9102 SEQ=781678455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C64ABC0000000001030307) Nov 26 04:40:30 localhost systemd[1]: tmp-crun.yOdv8C.mount: Deactivated successfully. Nov 26 04:40:30 localhost podman[262795]: 2025-11-26 09:40:30.945666505 +0000 UTC m=+0.205869001 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:40:30 localhost podman[262795]: 2025-11-26 09:40:30.95455547 +0000 UTC m=+0.214757966 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:40:30 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:40:31 localhost nova_compute[229802]: 2025-11-26 09:40:31.050 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:33 localhost nova_compute[229802]: 2025-11-26 09:40:33.039 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:36 localhost nova_compute[229802]: 2025-11-26 09:40:36.087 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:40:37 localhost systemd[1]: tmp-crun.EgeXBc.mount: Deactivated successfully. 
Nov 26 04:40:37 localhost podman[262838]: 2025-11-26 09:40:37.850477623 +0000 UTC m=+0.106991001 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 04:40:37 localhost podman[262838]: 2025-11-26 09:40:37.921557142 +0000 UTC m=+0.178070540 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 04:40:37 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:40:38 localhost nova_compute[229802]: 2025-11-26 09:40:38.041 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:38 localhost sshd[262863]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:40:39 localhost systemd-logind[761]: New session 60 of user zuul. Nov 26 04:40:39 localhost systemd[1]: Started Session 60 of User zuul. 
Nov 26 04:40:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7820 DF PROTO=TCP SPT=59240 DPT=9102 SEQ=781678455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C66BFD0000000001030307) Nov 26 04:40:40 localhost python3.9[262974]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:40:41 localhost nova_compute[229802]: 2025-11-26 09:40:41.117 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:40:41 localhost systemd[1]: tmp-crun.tChcDn.mount: Deactivated successfully. Nov 26 04:40:41 localhost podman[263087]: 2025-11-26 09:40:41.855261123 +0000 UTC m=+0.110034266 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64) Nov 26 04:40:41 localhost podman[263087]: 2025-11-26 09:40:41.867919135 +0000 UTC m=+0.122692288 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 26 04:40:41 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:40:41 localhost python3.9[263086]: ansible-ansible.builtin.service_facts Invoked Nov 26 04:40:41 localhost network[263124]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 04:40:41 localhost network[263125]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:40:41 localhost network[263126]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 04:40:43 localhost nova_compute[229802]: 2025-11-26 09:40:43.044 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:40:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:40:44 localhost podman[263197]: 2025-11-26 09:40:44.011091676 +0000 UTC m=+0.095887118 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:40:44 localhost podman[263197]: 2025-11-26 09:40:44.025348527 +0000 UTC m=+0.110143979 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:40:44 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:40:45 localhost openstack_network_exporter[242153]: ERROR 09:40:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:40:45 localhost openstack_network_exporter[242153]: ERROR 09:40:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:40:45 localhost openstack_network_exporter[242153]: ERROR 09:40:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:40:45 localhost openstack_network_exporter[242153]: ERROR 09:40:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:40:45 localhost openstack_network_exporter[242153]: Nov 26 04:40:45 localhost openstack_network_exporter[242153]: ERROR 09:40:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:40:45 localhost openstack_network_exporter[242153]: Nov 26 04:40:46 localhost nova_compute[229802]: 2025-11-26 09:40:46.148 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:40:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:40:47 localhost podman[263386]: 2025-11-26 09:40:47.700371936 +0000 UTC m=+0.087113016 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:40:47 localhost podman[263386]: 2025-11-26 09:40:47.70824854 +0000 UTC 
m=+0.094989670 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:40:47 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:40:47 localhost podman[263387]: 2025-11-26 09:40:47.770375622 +0000 UTC m=+0.151385715 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Nov 26 04:40:47 localhost podman[263387]: 2025-11-26 09:40:47.811204674 +0000 UTC m=+0.192214767 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:40:47 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:40:47 localhost python3.9[263385]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 26 04:40:48 localhost nova_compute[229802]: 2025-11-26 09:40:48.092 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:48 localhost python3.9[263483]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:40:51 localhost nova_compute[229802]: 2025-11-26 09:40:51.180 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:53 localhost nova_compute[229802]: 2025-11-26 09:40:53.126 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:53 localhost python3.9[263595]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:40:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56913 DF PROTO=TCP SPT=46986 DPT=9102 SEQ=2958578714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52C6A4290000000001030307) Nov 26 04:40:54 localhost python3.9[263705]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:40:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56914 DF PROTO=TCP SPT=46986 DPT=9102 SEQ=2958578714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6A83C0000000001030307) Nov 26 04:40:55 localhost python3.9[263816]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:40:55 localhost nova_compute[229802]: 2025-11-26 09:40:55.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:40:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7821 DF PROTO=TCP SPT=59240 DPT=9102 SEQ=781678455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6ABFC0000000001030307) Nov 26 04:40:56 localhost nova_compute[229802]: 2025-11-26 09:40:56.182 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=56915 DF PROTO=TCP SPT=46986 DPT=9102 SEQ=2958578714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6B03D0000000001030307) Nov 26 04:40:57 localhost python3.9[263928]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:40:57 localhost podman[240049]: time="2025-11-26T09:40:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:40:57 localhost podman[240049]: @ - - [26/Nov/2025:09:40:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148615 "" "Go-http-client/1.1" Nov 26 04:40:57 localhost podman[240049]: @ - - [26/Nov/2025:09:40:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17231 "" "Go-http-client/1.1" Nov 26 04:40:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42655 DF PROTO=TCP SPT=48388 DPT=9102 SEQ=64747573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6B3FC0000000001030307) Nov 26 04:40:58 localhost nova_compute[229802]: 2025-11-26 09:40:58.163 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:40:58 localhost python3.9[264038]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system 
no_block=False force=None masked=None Nov 26 04:40:59 localhost python3.9[264150]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:41:00 localhost nova_compute[229802]: 2025-11-26 09:41:00.624 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:00 localhost python3.9[264262]: ansible-ansible.builtin.service_facts Invoked Nov 26 04:41:00 localhost network[264279]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 26 04:41:00 localhost network[264280]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:41:00 localhost network[264281]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 04:41:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:41:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56916 DF PROTO=TCP SPT=46986 DPT=9102 SEQ=2958578714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6BFFC0000000001030307) Nov 26 04:41:01 localhost podman[264286]: 2025-11-26 09:41:01.005809724 +0000 UTC m=+0.096084544 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 04:41:01 localhost podman[264286]: 2025-11-26 09:41:01.04641007 +0000 UTC m=+0.136684900 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 04:41:01 localhost nova_compute[229802]: 2025-11-26 09:41:01.186 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:01 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:41:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:41:01 localhost nova_compute[229802]: 2025-11-26 09:41:01.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:01 localhost nova_compute[229802]: 2025-11-26 09:41:01.609 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:41:01 localhost nova_compute[229802]: 2025-11-26 09:41:01.609 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:41:01 localhost systemd[1]: tmp-crun.AvQl15.mount: Deactivated successfully. 
Nov 26 04:41:01 localhost podman[264307]: 2025-11-26 09:41:01.686261057 +0000 UTC m=+0.078477959 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:41:01 localhost podman[264307]: 2025-11-26 09:41:01.725377038 +0000 UTC m=+0.117593970 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:41:01 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:41:02 localhost nova_compute[229802]: 2025-11-26 09:41:02.700 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:41:02 localhost nova_compute[229802]: 2025-11-26 09:41:02.701 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:41:02 localhost nova_compute[229802]: 2025-11-26 09:41:02.701 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:41:02 localhost nova_compute[229802]: 2025-11-26 09:41:02.701 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:41:03 localhost nova_compute[229802]: 2025-11-26 09:41:03.168 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:41:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:41:03.642 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:41:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:41:03.643 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:41:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:41:03.644 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.018 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.035 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.036 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.037 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.037 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.037 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.038 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.038 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.039 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.060 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.061 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.061 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.062 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.062 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.192 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.613 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.702 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.703 229806 DEBUG nova.virt.libvirt.driver [None 
req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.953 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.955 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12092MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.956 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:41:06 localhost nova_compute[229802]: 2025-11-26 09:41:06.957 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.107 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.108 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.108 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.230 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:41:07 localhost sshd[264576]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.702 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.711 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory 
/usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.739 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.742 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.742 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.743 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.743 229806 DEBUG nova.compute.manager [None 
req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.757 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.757 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:07 localhost nova_compute[229802]: 2025-11-26 09:41:07.758 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 26 04:41:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:41:08 localhost nova_compute[229802]: 2025-11-26 09:41:08.170 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:08 localhost systemd[1]: tmp-crun.gyS3Dj.mount: Deactivated successfully. 
Nov 26 04:41:08 localhost podman[264635]: 2025-11-26 09:41:08.192350099 +0000 UTC m=+0.096987682 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 26 04:41:08 localhost podman[264635]: 2025-11-26 09:41:08.239391875 +0000 UTC m=+0.144029498 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller) Nov 26 04:41:08 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:41:08 localhost python3.9[264715]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 26 04:41:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56917 DF PROTO=TCP SPT=46986 DPT=9102 SEQ=2958578714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6DFFC0000000001030307) Nov 26 04:41:09 localhost nova_compute[229802]: 2025-11-26 09:41:09.343 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:09 localhost nova_compute[229802]: 2025-11-26 09:41:09.344 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:09 localhost nova_compute[229802]: 2025-11-26 09:41:09.370 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:10 localhost python3.9[264825]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Nov 26 04:41:11 localhost python3.9[264935]: ansible-ansible.legacy.stat 
Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:11 localhost nova_compute[229802]: 2025-11-26 09:41:11.192 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:11 localhost python3.9[264992]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:41:12 localhost systemd[1]: tmp-crun.YWx3jn.mount: Deactivated successfully. Nov 26 04:41:12 localhost podman[265103]: 2025-11-26 09:41:12.349010307 +0000 UTC m=+0.103977748 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, name=ubi9-minimal) Nov 26 04:41:12 localhost podman[265103]: 2025-11-26 09:41:12.387270312 +0000 UTC m=+0.142237763 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public) Nov 26 04:41:12 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:41:12 localhost python3.9[265102]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:13 localhost nova_compute[229802]: 2025-11-26 09:41:13.175 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:13 localhost python3.9[265232]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:41:14 localhost python3.9[265342]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:41:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:41:14 localhost podman[265453]: 2025-11-26 09:41:14.84740415 +0000 UTC m=+0.094926809 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:41:14 localhost podman[265453]: 2025-11-26 09:41:14.882329081 +0000 UTC m=+0.129851750 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:41:14 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:41:14 localhost python3.9[265466]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:41:15 localhost openstack_network_exporter[242153]: ERROR 09:41:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:41:15 localhost openstack_network_exporter[242153]: ERROR 09:41:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:41:15 localhost openstack_network_exporter[242153]: ERROR 09:41:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:41:15 localhost openstack_network_exporter[242153]: ERROR 09:41:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:41:15 localhost openstack_network_exporter[242153]: Nov 26 04:41:15 localhost openstack_network_exporter[242153]: ERROR 09:41:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:41:15 localhost openstack_network_exporter[242153]: Nov 26 04:41:15 localhost python3.9[265589]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:41:16 localhost nova_compute[229802]: 2025-11-26 09:41:16.195 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:16 localhost python3.9[265700]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:17 localhost python3.9[265810]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:18 localhost nova_compute[229802]: 2025-11-26 09:41:18.211 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:41:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:41:18 localhost podman[265866]: 2025-11-26 09:41:18.815004191 +0000 UTC m=+0.075750925 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 04:41:18 localhost podman[265866]: 2025-11-26 09:41:18.826598509 +0000 UTC 
m=+0.087345223 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent) Nov 26 04:41:18 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:41:18 localhost podman[265867]: 2025-11-26 09:41:18.879252619 +0000 UTC m=+0.134327758 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible) Nov 26 04:41:18 localhost podman[265867]: 2025-11-26 09:41:18.897267216 +0000 UTC m=+0.152342385 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2) Nov 26 04:41:18 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:41:19 localhost python3.9[265958]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:19 localhost python3.9[266068]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:21 localhost nova_compute[229802]: 2025-11-26 09:41:21.198 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:21 localhost python3.9[266178]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:22 localhost python3.9[266288]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:41:23 localhost python3.9[266400]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:41:23 localhost nova_compute[229802]: 2025-11-26 09:41:23.258 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41556 DF PROTO=TCP SPT=35816 DPT=9102 SEQ=3892618140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C719590000000001030307) Nov 26 04:41:23 localhost python3.9[266510]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:24 localhost python3.9[266567]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:41:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41557 DF PROTO=TCP SPT=35816 DPT=9102 SEQ=3892618140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C71D7C0000000001030307) Nov 26 04:41:25 localhost python3.9[266677]: ansible-ansible.legacy.stat Invoked with 
path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56918 DF PROTO=TCP SPT=46986 DPT=9102 SEQ=2958578714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C71FFC0000000001030307) Nov 26 04:41:25 localhost python3.9[266734]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:41:26 localhost nova_compute[229802]: 2025-11-26 09:41:26.201 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:26 localhost python3.9[266844]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41558 DF PROTO=TCP SPT=35816 DPT=9102 SEQ=3892618140 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A52C7257C0000000001030307) Nov 26 04:41:26 localhost python3.9[266954]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:27 localhost python3.9[267011]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:27 localhost podman[240049]: time="2025-11-26T09:41:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:41:27 localhost podman[240049]: @ - - [26/Nov/2025:09:41:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148615 "" "Go-http-client/1.1" Nov 26 04:41:27 localhost podman[240049]: @ - - [26/Nov/2025:09:41:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17232 "" "Go-http-client/1.1" Nov 26 04:41:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7822 DF PROTO=TCP SPT=59240 DPT=9102 SEQ=781678455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C729FC0000000001030307) Nov 26 04:41:28 localhost python3.9[267121]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:28 localhost nova_compute[229802]: 2025-11-26 09:41:28.288 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:28 localhost python3.9[267178]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:29 localhost python3.9[267288]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:41:29 localhost systemd[1]: Reloading. Nov 26 04:41:30 localhost systemd-sysv-generator[267318]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:41:30 localhost systemd-rc-local-generator[267312]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41559 DF PROTO=TCP SPT=35816 DPT=9102 SEQ=3892618140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7353C0000000001030307) Nov 26 04:41:31 localhost nova_compute[229802]: 2025-11-26 09:41:31.204 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:41:31 localhost podman[267400]: 2025-11-26 09:41:31.849349611 +0000 UTC m=+0.101283294 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:41:31 localhost systemd[1]: Started /usr/bin/podman healthcheck 
run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:41:31 localhost podman[267400]: 2025-11-26 09:41:31.867369679 +0000 UTC m=+0.119303372 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS) Nov 26 04:41:31 localhost systemd[1]: 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:41:31 localhost podman[267455]: 2025-11-26 09:41:31.967469887 +0000 UTC m=+0.096917330 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:41:31 localhost podman[267455]: 2025-11-26 09:41:31.98437051 +0000 UTC m=+0.113817933 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:41:31 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:41:32 localhost python3.9[267462]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:32 localhost python3.9[267535]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:33 localhost nova_compute[229802]: 2025-11-26 09:41:33.333 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:34 localhost python3.9[267645]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:34 localhost python3.9[267702]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file 
path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:35 localhost python3.9[267812]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:41:35 localhost systemd[1]: Reloading. Nov 26 04:41:35 localhost nova_compute[229802]: 2025-11-26 09:41:35.696 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:41:35 localhost systemd-rc-local-generator[267834]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:41:35 localhost systemd-sysv-generator[267840]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:41:35 localhost nova_compute[229802]: 2025-11-26 09:41:35.739 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Triggering sync for uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 26 04:41:35 localhost nova_compute[229802]: 2025-11-26 09:41:35.740 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:41:35 localhost nova_compute[229802]: 2025-11-26 09:41:35.740 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:35 localhost nova_compute[229802]: 2025-11-26 09:41:35.784 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 
26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:41:36 localhost systemd[1]: Starting Create netns directory... Nov 26 04:41:36 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 26 04:41:36 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 26 04:41:36 localhost systemd[1]: Finished Create netns directory. 
Nov 26 04:41:36 localhost nova_compute[229802]: 2025-11-26 09:41:36.208 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:37 localhost python3.9[267963]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:41:38 localhost nova_compute[229802]: 2025-11-26 09:41:38.364 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:41:38 localhost podman[268074]: 2025-11-26 09:41:38.643103103 +0000 UTC m=+0.093735792 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller) Nov 26 04:41:38 localhost podman[268074]: 2025-11-26 09:41:38.718375031 +0000 UTC m=+0.169007690 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 04:41:38 localhost python3.9[268073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:38 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:41:39 localhost python3.9[268156]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:41:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41560 DF PROTO=TCP SPT=35816 DPT=9102 SEQ=3892618140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C755FD0000000001030307) Nov 26 04:41:40 localhost python3.9[268266]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:41:41 localhost python3.9[268376]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:41:41 localhost nova_compute[229802]: 2025-11-26 09:41:41.214 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:41 localhost python3.9[268433]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.hivv_xjm recurse=False 
state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:41:42 localhost podman[268522]: 2025-11-26 09:41:42.83388325 +0000 UTC m=+0.090335857 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, vendor=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 26 04:41:42 localhost podman[268522]: 2025-11-26 09:41:42.850681449 +0000 UTC m=+0.107134116 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, config_id=edpm, io.openshift.expose-services=, 
io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 26 04:41:42 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:41:43 localhost nova_compute[229802]: 2025-11-26 09:41:43.368 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:43 localhost python3.9[268563]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:44 localhost sshd[268731]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:41:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:41:45 localhost podman[268733]: 2025-11-26 09:41:45.502217129 +0000 UTC m=+0.087452097 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:41:45 localhost podman[268733]: 2025-11-26 09:41:45.51649034 +0000 UTC m=+0.101725298 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:41:45 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:41:45 localhost openstack_network_exporter[242153]: ERROR 09:41:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:41:45 localhost openstack_network_exporter[242153]: ERROR 09:41:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:41:45 localhost openstack_network_exporter[242153]: ERROR 09:41:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:41:45 localhost openstack_network_exporter[242153]: ERROR 09:41:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:41:45 localhost openstack_network_exporter[242153]: Nov 26 04:41:45 localhost openstack_network_exporter[242153]: ERROR 09:41:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:41:45 localhost openstack_network_exporter[242153]: Nov 26 04:41:46 localhost nova_compute[229802]: 2025-11-26 09:41:46.214 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:46 localhost python3.9[268864]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Nov 26 04:41:47 localhost python3.9[268974]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:41:47 localhost sshd[268992]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:41:48 localhost nova_compute[229802]: 2025-11-26 09:41:48.404 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:48 localhost python3.9[269086]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 26 04:41:49 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:41:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:41:49 localhost podman[269131]: 2025-11-26 09:41:49.81082343 +0000 UTC m=+0.067159409 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, 
config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:41:49 localhost podman[269131]: 2025-11-26 09:41:49.820443217 +0000 UTC m=+0.076779146 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 04:41:49 localhost systemd[1]: 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:41:49 localhost podman[269130]: 2025-11-26 09:41:49.876352557 +0000 UTC m=+0.132660236 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team) Nov 26 04:41:49 localhost podman[269130]: 2025-11-26 09:41:49.886300705 +0000 UTC m=+0.142608404 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Nov 26 04:41:49 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:41:51 localhost nova_compute[229802]: 2025-11-26 09:41:51.219 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:53 localhost python3[269259]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:41:53 localhost nova_compute[229802]: 2025-11-26 09:41:53.441 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:53 localhost python3[269259]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072",#012 "Digest": "sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:11:34.680484424Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 
"org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 249489385,#012 "VirtualSize": 249489385,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:d9e3e9c6b6b086eeb756b403557bba77ecef73e97936fb3285a5484cd95a1b1a"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": 
"2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set 
/etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:39.924297673Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:40.346524368Z",#012 Nov 26 04:41:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12286 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2198164754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C78E890000000001030307) Nov 26 04:41:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12287 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2198164754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7927C0000000001030307) Nov 26 04:41:55 localhost python3.9[269431]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:41:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41561 DF PROTO=TCP SPT=35816 DPT=9102 SEQ=3892618140 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A52C795FC0000000001030307) Nov 26 04:41:56 localhost nova_compute[229802]: 2025-11-26 09:41:56.221 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:56 localhost python3.9[269543]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12288 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2198164754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C79A7D0000000001030307) Nov 26 04:41:56 localhost python3.9[269598]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:41:57 localhost podman[240049]: time="2025-11-26T09:41:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:41:57 localhost podman[240049]: @ - - [26/Nov/2025:09:41:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148615 "" "Go-http-client/1.1" Nov 26 04:41:57 localhost podman[240049]: @ - - [26/Nov/2025:09:41:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1" Nov 26 04:41:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56919 DF PROTO=TCP SPT=46986 DPT=9102 SEQ=2958578714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C79DFD0000000001030307) Nov 26 04:41:58 localhost python3.9[269707]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764150117.6276565-1365-175228380205445/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:41:58 localhost nova_compute[229802]: 2025-11-26 09:41:58.495 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:41:59 localhost python3.9[269762]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:41:59 localhost python3.9[269872]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:42:00 localhost python3.9[269982]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12289 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2198164754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7AA3C0000000001030307) Nov 26 04:42:01 localhost nova_compute[229802]: 2025-11-26 09:42:01.225 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:01 localhost nova_compute[229802]: 2025-11-26 09:42:01.653 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:01 localhost nova_compute[229802]: 2025-11-26 09:42:01.654 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:42:01 localhost nova_compute[229802]: 2025-11-26 09:42:01.654 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:42:01 localhost python3.9[270092]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 26 04:42:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 04:42:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:42:02 localhost podman[270203]: 2025-11-26 09:42:02.487809175 +0000 UTC m=+0.095163786 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:42:02 localhost podman[270203]: 2025-11-26 09:42:02.502180099 +0000 UTC m=+0.109534760 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:42:02 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:42:02 localhost python3.9[270202]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Nov 26 04:42:02 localhost podman[270204]: 2025-11-26 09:42:02.595101135 +0000 UTC m=+0.202343802 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 26 04:42:02 localhost podman[270204]: 2025-11-26 09:42:02.608323554 +0000 UTC m=+0.215566241 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:42:02 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:42:02 localhost nova_compute[229802]: 2025-11-26 09:42:02.782 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:42:02 localhost nova_compute[229802]: 2025-11-26 09:42:02.782 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:42:02 localhost nova_compute[229802]: 2025-11-26 09:42:02.783 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:42:02 localhost nova_compute[229802]: 2025-11-26 09:42:02.783 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:42:03 localhost python3.9[270354]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.573 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.577 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.578 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.578 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.578 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.618 12 DEBUG 
ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 627516836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.619 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 21052656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ccf9dca-f4e3-42dd-b24b-1f8ee8305816', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 627516836, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.579079', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 
'vda'}, 'message_id': '29fe0d20-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': '746e8c32b5d2cd352148ffc62752677835f5474d962b70d6ec85d7a0df00f424'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21052656, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:42:03.579079', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '29fe24c2-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': '8d5ec4e53d4664c649b8337b078e50cb701f22d7a13863802f28d0760e20f323'}]}, 'timestamp': '2025-11-26 09:42:03.619536', '_unique_id': '575e4b10a8914a3b8c53b88bc06c415a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.621 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.622 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.bytes in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.626 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 9035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c3112cc-cbfe-4d6f-a15f-441a66c2372f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9035, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.623107', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 
'29ff4712-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': 'c4f4604a2f244fc7b102fb65211a0a5053c3a6f0a4280ad913d1e024b34af925'}]}, 'timestamp': '2025-11-26 09:42:03.626968', '_unique_id': '24f20f2988c64822bbe6d12370cddca8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.628 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.629 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.629 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eb602e57-2f3e-49be-b005-1c03552a2bb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.629462', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '29ffbda0-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': 'fb73ce618a89397e552261339d694cc1cc463de6b2d287196be242cc00156ded'}]}, 'timestamp': '2025-11-26 09:42:03.629992', '_unique_id': '46b1d0baa2544066b22a682acdddb651'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.631 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.632 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:42:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:42:03.643 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:42:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:42:03.644 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:42:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:42:03.645 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.646 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.647 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '330a3463-768f-4164-ac84-a7efb603e1fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.632793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a0262b2-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.875048974, 'message_signature': 
'f431df508531c79707b1c617d2ce91771879deaa2a58263ab509bc35dbe3e306'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:42:03.632793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a0277f2-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.875048974, 'message_signature': '2bf24796159c8a7a4588c07652a433e81f0168d993e1c5c08f7fd5cba8857ee8'}]}, 'timestamp': '2025-11-26 09:42:03.647815', '_unique_id': 'dfd3ec8f1ca94f3380a95c0910d637b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 
12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.649 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 61350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97d9390e-4fa5-4fcf-8d01-a468992759e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 61350000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:42:03.650795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2a062d5c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.913564456, 'message_signature': '2f52a634e76aec0b4330b048ab80b67cbae5a90fe68d13618df193fb1811139f'}]}, 'timestamp': '2025-11-26 09:42:03.672279', '_unique_id': '9c3bf61749da4478a8579d459232738e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.673 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ffedf0b-9169-40b1-938a-10d740cb0cc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.675410', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '2a06c118-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': 'f940a09655f4b4ce2a803c60ad0c89bc1c8962b81ae8b4a42d5e92b1d486c846'}]}, 'timestamp': '2025-11-26 09:42:03.676018', '_unique_id': '5d66424427db4678bee763bada97930c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03db606b-a685-4002-942a-dda593f3dcfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.678444', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '2a0736d4-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': 'ed1ad72d8f20c16db3dc3351be96cfc3f0a138d1a536c8b774263757d2e7d6c2'}]}, 'timestamp': '2025-11-26 09:42:03.678972', '_unique_id': '50ccba4061dd49f2be52a0547781ad85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.681 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.681 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 52.296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf9114e6-4876-4dbb-83cc-b92c211abd08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:42:03.681385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2a07aa7e-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.913564456, 
'message_signature': '7ec63ad35ab9bd20164e038b80049a093494ce1bb770923af0b34635ddc59773'}]}, 'timestamp': '2025-11-26 09:42:03.681881', '_unique_id': '2f442c3a13bd42d6b02bf26fcc965478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.684 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a76953c4-70d8-4973-a79e-aca1f66b37b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.684327', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '2a081e82-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': '7a7a7787e8f86410e25713be74607bc4cf1f3ff63d2fc016425d7a3d6258dc1e'}]}, 'timestamp': '2025-11-26 09:42:03.684888', '_unique_id': 'e15fce34c83746c9b47d469faecd603d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.685 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.687 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.688 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b78b957b-90fc-485a-a514-2a98360fcd3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.687536', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a089b5a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.875048974, 'message_signature': '56d45168bb313a61ee58697229efaec77a076ffdffa39baa1ea55edb303a9105'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:42:03.687536', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a08b608-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.875048974, 'message_signature': '0cdb80cc99f131b867a710d0265b4502c554d2e5621b2ff3186b57772ee1ba91'}]}, 'timestamp': '2025-11-26 09:42:03.688736', '_unique_id': 'a968401860d449de8eac42e489fe81c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.691 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.691 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.692 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5c7b7121-275b-48c8-ba80-f0ba29937881', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.691483', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a0934b6-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': '46aa65513379184d6211e4b2d2a8c3b33eb44a21f03fb20cb53599de5ac839a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:42:03.691483', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a09531a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': '8eebf1706205e03da04b4eebec7bf630c89840011a6918867a037d108b71d5de'}]}, 'timestamp': '2025-11-26 09:42:03.692760', '_unique_id': '2cae1e128dc343c69f6be23ad0581ae4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.695 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.696 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f5c901c-0f0c-4165-8825-7422b85fff5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.695405', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a09cfca-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.875048974, 'message_signature': '09ffbd8f39b837545752f265b43719c6a7a755ae390ab697496141874adb42fe'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': 
'2025-11-26T09:42:03.695405', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a09e988-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.875048974, 'message_signature': 'bed7f91ab95772995acb52fed11fd3eb35b7a137835ddd3ee7f229fdde10606d'}]}, 'timestamp': '2025-11-26 09:42:03.696686', '_unique_id': '928702c305974efb85eaa78e8614f604'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.697 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.699 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.699 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1141678425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.700 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 173265014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08ebf18c-994a-4205-b28b-43f45021a1dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1141678425, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.699323', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a0a6a2a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': '9b26b523bb2b94b7a6ca364985e8e560f1819671a00b1ce86c36336847a8040e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 173265014, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:42:03.699323', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a0a83de-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': 'edc09ab7c20d504762314bdac0030d06fc5cbd9719bec48d1595656113a7872b'}]}, 'timestamp': '2025-11-26 09:42:03.700563', '_unique_id': '854142445bb747bdb90d9aae3ae85ca5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.701 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b4d3da7-5aba-44f5-baba-3ccded461ffe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.703199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a0aff30-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': 'd2320e933146fbe3cbf5c0b013b52a72c5fc7049d6da62fe5309a15d95036bd7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:42:03.703199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a0b129a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': '885512c2622b5d376cd255cfc1dddd40b0950fec5a790ef06e6b8e533c2e0236'}]}, 'timestamp': '2025-11-26 09:42:03.704291', '_unique_id': 'e268bca47dfe4f419f856aee8b20de68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.705 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.706 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.706 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b53491b2-e9ec-4691-9da1-688c2717fc2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.706844', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '2a0b9008-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': 'c86575c58efa0e54aaf115f8431ed3aeea04b0e4118e0ef4cfef32a0f213dedc'}]}, 'timestamp': '2025-11-26 09:42:03.707479', '_unique_id': '98e780cb19524d3391813c27f4c630ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging conn =
self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 
04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.708 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.709 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.710 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d1737bc-7861-4db1-a7cc-86b233440a2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.710130', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '2a0c0cfe-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': '2619803ba72e3edf9e57d4d058e9cde824dc47bb8c8e54e98fa2d77e120b89c8'}]}, 'timestamp': '2025-11-26 09:42:03.710652', '_unique_id': '26a7194745de4b5aabe4f8287088f4a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.711 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.712 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.712 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5ed3d9d7-c419-48d1-b2ff-759ae037a1ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.712957', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '2a0c77b6-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': '0182bbdaa7472e3353694cf2a221357e03d25fd8d7fadf91eaaf35026d3ae05e'}]}, 'timestamp': '2025-11-26 09:42:03.713272', '_unique_id': 'b4459291870f4101bf35ffccc943cceb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.713 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.714 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.714 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eff0fbdf-dd85-48c9-b313-cfbe0a204c5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.714702', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '2a0cbb7c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': '479d8743d23179e4230aee8462d6713d25af10d54eacd09c50406386097bbdbc'}]}, 'timestamp': '2025-11-26 09:42:03.715059', '_unique_id': '2f8d24fda74f4e1cb2b6d62327a75a80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.715 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.716 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.716 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.717 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '30aaca17-bfb2-42fc-ab5e-0a0d431bdcc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.716783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a0d0d7a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': '97a725db5f350fdc01c3eed9507483b769377dceb33b494187944e3e50c5c522'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:42:03.716783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a0d1a2c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': 'e61cb05e5b7d300d30fc0051f38082d6ce533e61b26cb7ad9c8f983dcba2f32d'}]}, 'timestamp': '2025-11-26 09:42:03.717414', '_unique_id': 'eeb959f8e65d47cd90eb1dd5eef44c7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.718 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f9dde3e2-6001-4a1c-8d10-774c2cb2f64e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:42:03.718872', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '2a0d602c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.865408166, 'message_signature': '827581016b3394a3a8a0ea01e9ffe231c6a3126f26d52884d65fa45df276c28b'}]}, 'timestamp': '2025-11-26 09:42:03.719247', '_unique_id': '6e6afcb2c4174ba4b55223a76908d8e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.719 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.721 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.722 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7ea2601-2ace-42dd-9f8f-fb54a6df14e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:42:03.721602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a0dcd5a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': 'f83b9f0b369ef21fa2616515253bfb824c38be578bdd0e3095e590ab8129f60a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:42:03.721602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a0ddc8c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10598.821343073, 'message_signature': 'd8dd98c455b6dc384fdf18f81c662a1a0386b41f5c3242299583d5ad96bd05a7'}]}, 'timestamp': '2025-11-26 09:42:03.722404', '_unique_id': 'a8c83767f8114688b279ebd97eaaa760'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:42:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:42:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:42:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 04:42:03 localhost python3.9[270411]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.931 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.949 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.950 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.951 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.951 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.951 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.970 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.970 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.971 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.971 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:42:03 localhost nova_compute[229802]: 2025-11-26 09:42:03.972 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.440 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.518 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.519 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:42:04 localhost python3.9[270541]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.750 229806 WARNING nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.752 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12102MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.753 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.753 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.888 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.889 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.889 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.919 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.949 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 
04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.949 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.964 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 04:42:04 localhost nova_compute[229802]: 2025-11-26 09:42:04.989 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: 
COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_FMA3,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_F16C,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_DEVICE_TAGGING,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 04:42:05 localhost nova_compute[229802]: 2025-11-26 09:42:05.035 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:42:05 localhost nova_compute[229802]: 2025-11-26 09:42:05.501 229806 DEBUG oslo_concurrency.processutils [None 
req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:42:05 localhost nova_compute[229802]: 2025-11-26 09:42:05.507 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:42:05 localhost python3.9[270673]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 26 04:42:05 localhost nova_compute[229802]: 2025-11-26 09:42:05.527 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:42:05 localhost nova_compute[229802]: 2025-11-26 09:42:05.530 229806 DEBUG 
nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:42:05 localhost nova_compute[229802]: 2025-11-26 09:42:05.530 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:42:06 localhost nova_compute[229802]: 2025-11-26 09:42:06.188 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:06 localhost nova_compute[229802]: 2025-11-26 09:42:06.188 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:06 localhost nova_compute[229802]: 2025-11-26 09:42:06.189 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:06 localhost nova_compute[229802]: 2025-11-26 09:42:06.189 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:06 localhost 
nova_compute[229802]: 2025-11-26 09:42:06.190 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:42:06 localhost nova_compute[229802]: 2025-11-26 09:42:06.228 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:08 localhost nova_compute[229802]: 2025-11-26 09:42:08.639 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:42:08 localhost podman[270756]: 2025-11-26 09:42:08.976544088 +0000 UTC m=+0.086845226 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 26 04:42:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12290 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2198164754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7C9FD0000000001030307) Nov 26 04:42:09 localhost podman[270756]: 2025-11-26 09:42:09.068018306 +0000 UTC m=+0.178319494 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 04:42:09 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:42:09 localhost systemd[1]: tmp-crun.qxPaAG.mount: Deactivated successfully. Nov 26 04:42:09 localhost podman[270829]: 2025-11-26 09:42:09.247481335 +0000 UTC m=+0.090731426 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, RELEASE=main, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True) Nov 26 04:42:09 localhost podman[270829]: 2025-11-26 09:42:09.353585216 +0000 UTC m=+0.196835297 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, version=7, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, ceph=True, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 04:42:09 localhost nova_compute[229802]: 2025-11-26 09:42:09.610 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:42:10 localhost python3.9[271036]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 26 04:42:11 localhost nova_compute[229802]: 2025-11-26 09:42:11.231 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:11 localhost python3.9[271182]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:12 localhost python3.9[271292]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:42:12 localhost systemd[1]: Reloading. Nov 26 04:42:12 localhost systemd-sysv-generator[271320]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:42:12 localhost systemd-rc-local-generator[271315]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:42:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:42:13 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:13 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:13 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:13 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:42:13 localhost podman[271329]: 2025-11-26 09:42:13.315587384 +0000 UTC m=+0.098107894 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 26 04:42:13 localhost podman[271329]: 2025-11-26 09:42:13.355758097 +0000 UTC m=+0.138278617 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 26 04:42:13 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:42:13 localhost nova_compute[229802]: 2025-11-26 09:42:13.643 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:13 localhost python3.9[271457]: ansible-ansible.builtin.service_facts Invoked Nov 26 04:42:14 localhost network[271474]: You are using 'network' service provided by 'network-scripts', which are now deprecated. 
Nov 26 04:42:14 localhost network[271475]: 'network-scripts' will be removed from distribution in near future. Nov 26 04:42:14 localhost network[271476]: It is advised to switch to 'NetworkManager' instead for network management. Nov 26 04:42:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:42:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:42:15 localhost podman[271525]: 2025-11-26 09:42:15.672716594 +0000 UTC m=+0.104710289 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:42:15 localhost podman[271525]: 2025-11-26 09:42:15.681181976 +0000 UTC m=+0.113175711 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:42:15 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:42:15 localhost openstack_network_exporter[242153]: ERROR 09:42:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:42:15 localhost openstack_network_exporter[242153]: ERROR 09:42:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:42:15 localhost openstack_network_exporter[242153]: ERROR 09:42:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:42:15 localhost openstack_network_exporter[242153]: ERROR 09:42:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:42:15 localhost openstack_network_exporter[242153]: Nov 26 04:42:15 localhost openstack_network_exporter[242153]: ERROR 09:42:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:42:15 localhost openstack_network_exporter[242153]: Nov 26 04:42:16 localhost nova_compute[229802]: 2025-11-26 09:42:16.234 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:18 localhost python3.9[271733]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:42:18 localhost nova_compute[229802]: 2025-11-26 09:42:18.674 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:19 localhost python3.9[271844]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:42:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:42:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:42:19 localhost podman[271846]: 2025-11-26 09:42:19.967329587 +0000 UTC m=+0.104323867 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 04:42:19 localhost podman[271846]: 2025-11-26 09:42:19.98132978 +0000 UTC m=+0.118324020 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 04:42:19 localhost systemd[1]: 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:42:20 localhost systemd[1]: tmp-crun.jHLRug.mount: Deactivated successfully. Nov 26 04:42:20 localhost podman[271882]: 2025-11-26 09:42:20.080105673 +0000 UTC m=+0.101106187 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 04:42:20 localhost podman[271882]: 2025-11-26 09:42:20.091361651 +0000 UTC m=+0.112362135 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 
04:42:20 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:42:20 localhost python3.9[271991]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:42:21 localhost nova_compute[229802]: 2025-11-26 09:42:21.238 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:22 localhost python3.9[272103]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:42:22 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 26 04:42:22 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
Nov 26 04:42:23 localhost python3.9[272217]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:42:23 localhost nova_compute[229802]: 2025-11-26 09:42:23.679 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35910 DF PROTO=TCP SPT=57510 DPT=9102 SEQ=1397795905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C803B90000000001030307) Nov 26 04:42:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35911 DF PROTO=TCP SPT=57510 DPT=9102 SEQ=1397795905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C807BC0000000001030307) Nov 26 04:42:24 localhost sshd[272330]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:42:24 localhost python3.9[272329]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:42:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12291 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2198164754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C809FC0000000001030307) Nov 26 04:42:25 localhost python3.9[272442]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False 
daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:42:26 localhost nova_compute[229802]: 2025-11-26 09:42:26.244 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:26 localhost python3.9[272553]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:42:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35912 DF PROTO=TCP SPT=57510 DPT=9102 SEQ=1397795905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C80FBC0000000001030307) Nov 26 04:42:27 localhost podman[240049]: time="2025-11-26T09:42:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:42:27 localhost podman[240049]: @ - - [26/Nov/2025:09:42:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148615 "" "Go-http-client/1.1" Nov 26 04:42:27 localhost podman[240049]: @ - - [26/Nov/2025:09:42:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17224 "" "Go-http-client/1.1" Nov 26 04:42:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41562 DF PROTO=TCP SPT=35816 DPT=9102 SEQ=3892618140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C813FC0000000001030307) Nov 26 04:42:28 localhost python3.9[272839]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:28 localhost nova_compute[229802]: 2025-11-26 09:42:28.722 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:29 localhost python3.9[272949]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:29 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 26 04:42:29 localhost rhsm-service[6576]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
Nov 26 04:42:29 localhost python3.9[273061]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:30 localhost python3.9[273171]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35913 DF PROTO=TCP SPT=57510 DPT=9102 SEQ=1397795905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C81F7D0000000001030307) Nov 26 04:42:30 localhost python3.9[273281]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:31 localhost nova_compute[229802]: 2025-11-26 09:42:31.244 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:32 localhost 
python3.9[273391]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:42:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:42:32 localhost podman[273502]: 2025-11-26 09:42:32.812582853 +0000 UTC m=+0.090893701 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:42:32 localhost podman[273502]: 2025-11-26 09:42:32.846549563 +0000 UTC m=+0.124860441 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:42:32 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:42:32 localhost podman[273503]: 2025-11-26 09:42:32.863429075 +0000 UTC m=+0.143238480 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:42:32 localhost podman[273503]: 2025-11-26 09:42:32.874636091 +0000 UTC m=+0.154445506 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:42:32 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:42:32 localhost python3.9[273501]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:33 localhost python3.9[273653]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:33 localhost nova_compute[229802]: 2025-11-26 09:42:33.761 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:34 localhost python3.9[273763]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:35 localhost python3.9[273873]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:35 localhost python3.9[273983]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:36 localhost nova_compute[229802]: 2025-11-26 09:42:36.246 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:36 localhost python3.9[274093]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:37 localhost python3.9[274203]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:37 localhost python3.9[274313]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:38 localhost python3.9[274423]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:38 localhost nova_compute[229802]: 2025-11-26 09:42:38.762 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:39 localhost python3.9[274533]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:42:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35914 DF PROTO=TCP SPT=57510 DPT=9102 SEQ=1397795905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C83FFC0000000001030307) Nov 26 04:42:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:42:39 localhost podman[274628]: 2025-11-26 09:42:39.840519977 +0000 UTC m=+0.090227991 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 04:42:39 localhost podman[274628]: 2025-11-26 09:42:39.883483485 +0000 UTC m=+0.133191529 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 
Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 04:42:39 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:42:39 localhost python3.9[274655]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:40 localhost python3.9[274779]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 26 04:42:41 localhost nova_compute[229802]: 2025-11-26 09:42:41.250 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:41 localhost python3.9[274889]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 26 04:42:41 localhost systemd[1]: Reloading. Nov 26 04:42:42 localhost systemd-sysv-generator[274919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:42:42 localhost systemd-rc-local-generator[274915]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:42:43 localhost python3.9[275035]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:42:43 localhost podman[275147]: 2025-11-26 09:42:43.637169433 +0000 UTC m=+0.087244688 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal) Nov 26 04:42:43 localhost podman[275147]: 2025-11-26 09:42:43.652307242 +0000 UTC m=+0.102382497 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter) Nov 26 04:42:43 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:42:43 localhost python3.9[275146]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:43 localhost nova_compute[229802]: 2025-11-26 09:42:43.764 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:44 localhost python3.9[275277]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:45 localhost python3.9[275388]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:42:45 localhost openstack_network_exporter[242153]: ERROR 09:42:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:42:45 localhost openstack_network_exporter[242153]: ERROR 09:42:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:42:45 localhost openstack_network_exporter[242153]: ERROR 09:42:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:42:45 localhost openstack_network_exporter[242153]: ERROR 09:42:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:42:45 localhost openstack_network_exporter[242153]: Nov 26 04:42:45 localhost openstack_network_exporter[242153]: ERROR 09:42:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:42:45 localhost openstack_network_exporter[242153]: Nov 26 04:42:45 localhost systemd[1]: tmp-crun.smjX3X.mount: Deactivated successfully. 
Nov 26 04:42:45 localhost podman[275407]: 2025-11-26 09:42:45.823310616 +0000 UTC m=+0.088512798 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:42:45 localhost podman[275407]: 2025-11-26 09:42:45.837101572 +0000 UTC m=+0.102303694 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:42:45 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:42:46 localhost nova_compute[229802]: 2025-11-26 09:42:46.252 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:47 localhost python3.9[275522]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:47 localhost python3.9[275633]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:48 localhost python3.9[275744]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:48 localhost nova_compute[229802]: 2025-11-26 09:42:48.801 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:49 localhost python3.9[275855]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:42:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:42:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:42:50 localhost podman[275874]: 2025-11-26 09:42:50.841154701 +0000 UTC m=+0.098620690 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Nov 26 04:42:50 localhost podman[275874]: 2025-11-26 09:42:50.871140338 +0000 UTC m=+0.128606347 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:42:50 
localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:42:50 localhost podman[275875]: 2025-11-26 09:42:50.941459943 +0000 UTC m=+0.194573698 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:42:50 localhost 
podman[275875]: 2025-11-26 09:42:50.954550897 +0000 UTC m=+0.207664622 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, tcib_managed=true) Nov 26 04:42:50 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:42:51 localhost nova_compute[229802]: 2025-11-26 09:42:51.258 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:51 localhost python3.9[276002]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:52 localhost python3.9[276112]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:53 localhost python3.9[276222]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25481 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=2507384895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C878E90000000001030307) Nov 26 04:42:53 localhost 
nova_compute[229802]: 2025-11-26 09:42:53.842 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:54 localhost python3.9[276332]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:54 localhost python3.9[276442]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25482 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=2507384895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C87CFC0000000001030307) Nov 26 04:42:55 localhost python3.9[276552]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35915 DF PROTO=TCP SPT=57510 DPT=9102 SEQ=1397795905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C87FFC0000000001030307) Nov 26 04:42:55 localhost python3.9[276662]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:56 localhost nova_compute[229802]: 2025-11-26 09:42:56.258 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:42:56 localhost python3.9[276772]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25483 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=2507384895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C884FC0000000001030307) Nov 26 04:42:57 localhost podman[240049]: time="2025-11-26T09:42:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:42:57 localhost podman[240049]: @ - - [26/Nov/2025:09:42:57 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148615 "" "Go-http-client/1.1" Nov 26 04:42:57 localhost podman[240049]: @ - - [26/Nov/2025:09:42:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17227 "" "Go-http-client/1.1" Nov 26 04:42:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12292 DF PROTO=TCP SPT=50428 DPT=9102 SEQ=2198164754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C887FC0000000001030307) Nov 26 04:42:57 localhost python3.9[276882]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:58 localhost python3.9[276992]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 26 04:42:58 localhost nova_compute[229802]: 2025-11-26 09:42:58.876 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25484 DF PROTO=TCP 
SPT=38760 DPT=9102 SEQ=2507384895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C894BC0000000001030307) Nov 26 04:43:01 localhost nova_compute[229802]: 2025-11-26 09:43:01.261 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:02 localhost sshd[277010]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:43:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:43:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:43:03 localhost nova_compute[229802]: 2025-11-26 09:43:03.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:03 localhost nova_compute[229802]: 2025-11-26 09:43:03.610 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:43:03 localhost nova_compute[229802]: 2025-11-26 09:43:03.610 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:43:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:43:03.645 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:43:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:43:03.646 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:43:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:43:03.648 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:43:03 localhost podman[277012]: 2025-11-26 09:43:03.707460939 +0000 UTC m=+0.088744475 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:43:03 localhost podman[277013]: 2025-11-26 09:43:03.759977742 +0000 UTC m=+0.136998137 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, 
name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 26 04:43:03 localhost podman[277012]: 2025-11-26 09:43:03.790417953 +0000 UTC m=+0.171701489 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, 
config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:43:03 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:43:03 localhost podman[277013]: 2025-11-26 09:43:03.844396112 +0000 UTC m=+0.221416577 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm) Nov 26 04:43:03 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:43:03 localhost nova_compute[229802]: 2025-11-26 09:43:03.917 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.001 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.001 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.002 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.002 229806 DEBUG nova.objects.instance [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.301 229806 DEBUG nova.network.neutron [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.331 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - 
- -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.331 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.332 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.332 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.608 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.609 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running 
periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.609 229806 DEBUG nova.compute.manager [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.610 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.628 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.629 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.629 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.629 229806 DEBUG 
nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:43:05 localhost nova_compute[229802]: 2025-11-26 09:43:05.630 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.105 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.170 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.171 229806 DEBUG nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.267 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.394 229806 WARNING 
nova.virt.libvirt.driver [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.395 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12102MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.396 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.396 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:43:06 localhost python3.9[277166]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.528 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.529 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.529 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:43:06 localhost nova_compute[229802]: 2025-11-26 09:43:06.565 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:43:07 localhost nova_compute[229802]: 2025-11-26 09:43:07.034 229806 DEBUG oslo_concurrency.processutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:43:07 localhost nova_compute[229802]: 2025-11-26 09:43:07.043 229806 DEBUG nova.compute.provider_tree [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:43:07 localhost nova_compute[229802]: 
2025-11-26 09:43:07.066 229806 DEBUG nova.scheduler.client.report [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:43:07 localhost nova_compute[229802]: 2025-11-26 09:43:07.069 229806 DEBUG nova.compute.resource_tracker [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:43:07 localhost nova_compute[229802]: 2025-11-26 09:43:07.069 229806 DEBUG oslo_concurrency.lockutils [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:43:07 localhost sshd[277207]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:43:07 localhost systemd-logind[761]: New session 61 of user zuul. Nov 26 04:43:07 localhost systemd[1]: Started Session 61 of User zuul. Nov 26 04:43:07 localhost systemd[1]: session-61.scope: Deactivated successfully. Nov 26 04:43:07 localhost systemd-logind[761]: Session 61 logged out. Waiting for processes to exit. Nov 26 04:43:07 localhost systemd-logind[761]: Removed session 61. 
Nov 26 04:43:08 localhost nova_compute[229802]: 2025-11-26 09:43:08.070 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:08 localhost python3.9[277318]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:43:08 localhost python3.9[277404]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764150187.9606953-3039-87195669268646/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:43:08 localhost nova_compute[229802]: 2025-11-26 09:43:08.959 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25485 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=2507384895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8B5FC0000000001030307) Nov 26 04:43:09 localhost python3.9[277512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:43:09 
localhost nova_compute[229802]: 2025-11-26 09:43:09.605 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:09 localhost nova_compute[229802]: 2025-11-26 09:43:09.632 229806 DEBUG oslo_service.periodic_task [None req-e4d96e47-e76e-4e15-b5e4-ce7d9bd36723 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:43:10 localhost python3.9[277567]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:43:10 localhost python3.9[277675]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:43:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:43:10 localhost podman[277708]: 2025-11-26 09:43:10.862428521 +0000 UTC m=+0.121535330 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible) Nov 26 04:43:10 localhost podman[277708]: 2025-11-26 09:43:10.94039233 +0000 UTC m=+0.199499149 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 
'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller) Nov 26 04:43:10 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:43:11 localhost python3.9[277783]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764150190.1696408-3039-193667795708780/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:43:11 localhost nova_compute[229802]: 2025-11-26 09:43:11.271 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:11 localhost python3.9[277947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:43:12 localhost python3.9[278047]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764150191.2868319-3039-155975471974159/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bf2176996cbca305070d0fff5e0027db1ed8fcef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:43:12 localhost python3.9[278173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:43:13 localhost python3.9[278259]: 
ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764150192.4751809-3039-261580381782335/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:43:13 localhost podman[278315]: 2025-11-26 09:43:13.840600411 +0000 UTC m=+0.094915756 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9) Nov 26 04:43:13 localhost podman[278315]: 2025-11-26 09:43:13.85933424 +0000 UTC m=+0.113649585 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal) Nov 26 04:43:13 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:43:13 localhost nova_compute[229802]: 2025-11-26 09:43:13.985 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:14 localhost python3.9[278388]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:43:14 localhost python3.9[278474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764150193.6674309-3039-180767736976287/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:43:15 localhost python3.9[278584]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:43:15 localhost 
openstack_network_exporter[242153]: ERROR 09:43:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:43:15 localhost openstack_network_exporter[242153]: ERROR 09:43:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:43:15 localhost openstack_network_exporter[242153]: ERROR 09:43:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:43:15 localhost openstack_network_exporter[242153]: ERROR 09:43:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:43:15 localhost openstack_network_exporter[242153]: Nov 26 04:43:15 localhost openstack_network_exporter[242153]: ERROR 09:43:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:43:15 localhost openstack_network_exporter[242153]: Nov 26 04:43:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:43:16 localhost systemd[1]: tmp-crun.WA3KrY.mount: Deactivated successfully. 
Nov 26 04:43:16 localhost podman[278694]: 2025-11-26 09:43:16.171574693 +0000 UTC m=+0.084139013 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:43:16 localhost podman[278694]: 2025-11-26 09:43:16.184510462 +0000 UTC m=+0.097074792 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:43:16 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:43:16 localhost nova_compute[229802]: 2025-11-26 09:43:16.274 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:16 localhost python3.9[278695]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:43:16 localhost python3.9[278827]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:43:17 localhost python3.9[278939]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:43:18 localhost python3.9[279047]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:43:19 localhost nova_compute[229802]: 2025-11-26 09:43:19.018 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:19 localhost python3.9[279157]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True 
get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:43:19 localhost python3.9[279212]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:43:20 localhost python3.9[279320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 26 04:43:21 localhost python3.9[279375]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 26 04:43:21 localhost nova_compute[229802]: 2025-11-26 09:43:21.277 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 04:43:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:43:21 localhost podman[279452]: 2025-11-26 09:43:21.846305997 +0000 UTC m=+0.095839464 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 
04:43:21 localhost podman[279452]: 2025-11-26 09:43:21.885357934 +0000 UTC m=+0.134891341 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 04:43:21 localhost systemd[1]: tmp-crun.4RDluP.mount: Deactivated successfully. 
Nov 26 04:43:21 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:43:21 localhost podman[279448]: 2025-11-26 09:43:21.902396632 +0000 UTC m=+0.155786328 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Nov 26 04:43:21 localhost podman[279448]: 2025-11-26 09:43:21.935600878 +0000 UTC m=+0.188990564 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 04:43:21 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:43:22 localhost python3.9[279516]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Nov 26 04:43:22 localhost python3.9[279633]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:43:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60810 DF PROTO=TCP SPT=59486 DPT=9102 SEQ=338990211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8EE190000000001030307) Nov 26 04:43:23 localhost python3[279743]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:43:24 localhost nova_compute[229802]: 2025-11-26 09:43:24.073 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60811 DF PROTO=TCP SPT=59486 DPT=9102 SEQ=338990211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8F23C0000000001030307) Nov 26 04:43:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25486 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=2507384895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8F5FC0000000001030307) Nov 26 04:43:26 localhost python3[279743]: 
ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",#012 "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:33:31.011385583Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211770748,#012 "VirtualSize": 1211770748,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 
"UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",#012 "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": 
"2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf 
main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 26 04:43:26 localhost nova_compute[229802]: 2025-11-26 09:43:26.280 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60812 DF PROTO=TCP SPT=59486 DPT=9102 SEQ=338990211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8FA3C0000000001030307) Nov 26 04:43:26 localhost python3.9[279911]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:43:27 localhost podman[240049]: time="2025-11-26T09:43:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:43:27 localhost podman[240049]: @ - - [26/Nov/2025:09:43:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148615 "" "Go-http-client/1.1" Nov 26 04:43:27 localhost podman[240049]: @ - - [26/Nov/2025:09:43:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17228 "" "Go-http-client/1.1" Nov 26 04:43:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35916 DF PROTO=TCP SPT=57510 DPT=9102 SEQ=1397795905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8FDFC0000000001030307) Nov 26 04:43:28 localhost python3.9[280023]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Nov 26 04:43:28 localhost python3.9[280133]: 
ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 26 04:43:29 localhost nova_compute[229802]: 2025-11-26 09:43:29.076 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:29 localhost python3[280243]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 26 04:43:30 localhost python3[280243]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",#012 "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:33:31.011385583Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 
"Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211770748,#012 "VirtualSize": 1211770748,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",#012 "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c 
#(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main 
clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 26 04:43:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60813 DF PROTO=TCP SPT=59486 DPT=9102 SEQ=338990211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C909FD0000000001030307) Nov 26 04:43:31 localhost python3.9[280414]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:43:31 localhost nova_compute[229802]: 2025-11-26 09:43:31.283 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:31 localhost python3.9[280526]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:43:32 localhost python3.9[280635]: ansible-copy Invoked with 
src=/home/zuul/.ansible/tmp/ansible-tmp-1764150212.0058174-3716-23144524400367/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:43:33 localhost python3.9[280690]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:43:34 localhost nova_compute[229802]: 2025-11-26 09:43:34.105 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:34 localhost python3.9[280800]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:43:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:43:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:43:34 localhost podman[280801]: 2025-11-26 09:43:34.843176242 +0000 UTC m=+0.100767587 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:43:34 localhost systemd[1]: tmp-crun.BgHCr5.mount: Deactivated successfully. 
Nov 26 04:43:34 localhost podman[280802]: 2025-11-26 09:43:34.891501066 +0000 UTC m=+0.148488322 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm) Nov 26 04:43:34 localhost podman[280801]: 2025-11-26 09:43:34.902992371 +0000 UTC m=+0.160583746 container exec_died 
b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:43:34 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:43:34 localhost podman[280802]: 2025-11-26 09:43:34.953271335 +0000 UTC m=+0.210258641 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:43:34 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:43:36 localhost python3.9[280951]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:43:36 localhost nova_compute[229802]: 2025-11-26 09:43:36.285 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:37 localhost python3.9[281059]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 26 04:43:38 localhost sshd[281077]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:43:39 localhost python3.9[281171]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None 
health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 26 04:43:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60814 DF PROTO=TCP SPT=59486 DPT=9102 SEQ=338990211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C929FD0000000001030307) Nov 26 04:43:39 localhost systemd-journald[47778]: Field hash table of 
/run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 119.8 (399 of 333 items), suggesting rotation. Nov 26 04:43:39 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 26 04:43:39 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:43:39 localhost nova_compute[229802]: 2025-11-26 09:43:39.138 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:39 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:43:40 localhost python3.9[281306]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 26 04:43:40 localhost systemd[1]: Stopping nova_compute container... Nov 26 04:43:40 localhost nova_compute[229802]: 2025-11-26 09:43:40.212 229806 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Nov 26 04:43:41 localhost nova_compute[229802]: 2025-11-26 09:43:41.288 229806 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:43:41 localhost podman[281323]: 2025-11-26 09:43:41.836323519 +0000 UTC m=+0.088806316 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller) Nov 26 04:43:41 localhost podman[281323]: 2025-11-26 09:43:41.88131114 +0000 UTC m=+0.133793907 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 04:43:41 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:43:43 localhost sshd[281348]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:43:43 localhost nova_compute[229802]: 2025-11-26 09:43:43.696 229806 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Nov 26 04:43:43 localhost nova_compute[229802]: 2025-11-26 09:43:43.698 229806 DEBUG oslo_concurrency.lockutils [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:43:43 localhost nova_compute[229802]: 2025-11-26 09:43:43.698 229806 DEBUG oslo_concurrency.lockutils [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:43:43 localhost nova_compute[229802]: 2025-11-26 09:43:43.698 229806 DEBUG oslo_concurrency.lockutils [None req-e628e3f2-81f4-4ada-b402-9c624bd26dd6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:43:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:43:44 localhost podman[281350]: 2025-11-26 09:43:44.074024007 +0000 UTC m=+0.082852374 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 26 04:43:44 localhost podman[281350]: 2025-11-26 09:43:44.086980578 +0000 UTC m=+0.095809005 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git) Nov 26 04:43:44 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:43:44 localhost journal[202976]: End of file while reading data: Input/output error Nov 26 04:43:44 localhost systemd[1]: libpod-d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98.scope: Deactivated successfully. Nov 26 04:43:44 localhost systemd[1]: libpod-d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98.scope: Consumed 21.722s CPU time. 
Nov 26 04:43:44 localhost podman[281310]: 2025-11-26 09:43:44.297600049 +0000 UTC m=+4.160540019 container died d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=nova_compute) Nov 26 04:43:44 localhost systemd[1]: tmp-crun.BX0WmL.mount: Deactivated successfully. 
Nov 26 04:43:44 localhost podman[281310]: 2025-11-26 09:43:44.476656356 +0000 UTC m=+4.339596236 container cleanup d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251118, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 04:43:44 localhost podman[281310]: nova_compute Nov 26 04:43:44 localhost podman[281397]: error opening file `/run/crun/d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98/status`: No such file or directory Nov 26 04:43:44 localhost podman[281386]: 2025-11-26 09:43:44.586475201 +0000 UTC m=+0.071306346 container cleanup 
d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Nov 26 04:43:44 localhost podman[281386]: nova_compute Nov 26 04:43:44 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Nov 26 04:43:44 localhost systemd[1]: Stopped nova_compute container. Nov 26 04:43:44 localhost systemd[1]: Starting nova_compute container... Nov 26 04:43:44 localhost systemd[1]: Started libcrun container. 
Nov 26 04:43:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Nov 26 04:43:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 26 04:43:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 26 04:43:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 04:43:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be414c140a3c518600d6946a87c4cbe7ac9519e95399f3f816da17ed7bc9185b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 26 04:43:44 localhost podman[281401]: 2025-11-26 09:43:44.746735566 +0000 UTC m=+0.128214146 container init d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.build-date=20251118) Nov 26 04:43:44 localhost podman[281401]: 2025-11-26 09:43:44.75595353 +0000 UTC m=+0.137432120 container start d28800f3f46d9c1297abef0c0a14c0459ac4b900bed0df41b879062363102b98 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', 
'/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm) Nov 26 04:43:44 localhost podman[281401]: nova_compute Nov 26 04:43:44 localhost nova_compute[281415]: + sudo -E kolla_set_configs Nov 26 04:43:44 localhost systemd[1]: Started nova_compute container. Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Validating config file Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying service configuration files Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /etc/nova/nova.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/nova/nova.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 26 04:43:44 localhost 
nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /etc/ceph Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Creating directory /etc/ceph Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/ceph Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Nov 26 04:43:44 localhost 
nova_compute[281415]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Deleting /usr/sbin/iscsiadm Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Writing out command to execute Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Nov 26 04:43:44 localhost nova_compute[281415]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:43:44 localhost 
nova_compute[281415]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 26 04:43:44 localhost nova_compute[281415]: ++ cat /run_command Nov 26 04:43:44 localhost nova_compute[281415]: + CMD=nova-compute Nov 26 04:43:44 localhost nova_compute[281415]: + ARGS= Nov 26 04:43:44 localhost nova_compute[281415]: + sudo kolla_copy_cacerts Nov 26 04:43:44 localhost nova_compute[281415]: + [[ ! -n '' ]] Nov 26 04:43:44 localhost nova_compute[281415]: + . kolla_extend_start Nov 26 04:43:44 localhost nova_compute[281415]: Running command: 'nova-compute' Nov 26 04:43:44 localhost nova_compute[281415]: + echo 'Running command: '\''nova-compute'\''' Nov 26 04:43:44 localhost nova_compute[281415]: + umask 0022 Nov 26 04:43:44 localhost nova_compute[281415]: + exec nova-compute Nov 26 04:43:45 localhost systemd[1]: tmp-crun.6SV0Ih.mount: Deactivated successfully. Nov 26 04:43:45 localhost python3.9[281536]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None 
healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 26 04:43:45 localhost openstack_network_exporter[242153]: ERROR 09:43:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:43:45 localhost openstack_network_exporter[242153]: ERROR 09:43:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:43:45 
localhost openstack_network_exporter[242153]: ERROR 09:43:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:43:45 localhost openstack_network_exporter[242153]: ERROR 09:43:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:43:45 localhost openstack_network_exporter[242153]: Nov 26 04:43:45 localhost openstack_network_exporter[242153]: ERROR 09:43:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:43:45 localhost openstack_network_exporter[242153]: Nov 26 04:43:45 localhost systemd[1]: Started libpod-conmon-3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9.scope. Nov 26 04:43:46 localhost systemd[1]: Started libcrun container. Nov 26 04:43:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Nov 26 04:43:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 26 04:43:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 26 04:43:46 localhost podman[281563]: 2025-11-26 09:43:46.030064044 +0000 UTC m=+0.171876515 container init 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:43:46 localhost podman[281563]: 2025-11-26 09:43:46.040796786 +0000 UTC m=+0.182609267 container start 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 04:43:46 localhost python3.9[281536]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Applying nova statedir ownership Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/9d78bef9-6977-4fb5-b50b-ae75124e73af/ Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/9d78bef9-6977-4fb5-b50b-ae75124e73af already 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Setting selinux context of 
/var/lib/nova/instances/9d78bef9-6977-4fb5-b50b-ae75124e73af to system_u:object_r:container_file_t:s0 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/9d78bef9-6977-4fb5-b50b-ae75124e73af/console.log Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ed49784906b83c1a7713dc04a5e33f72ee029af6 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ed49784906b83c1a7713dc04a5e33f72ee029af6 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Nov 26 
04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: 
/var/lib/nova/.cache/python-entrypoints/4143dbbec5b08621aa3c8eb364f8a7d3e97604e18b7ed41c4bab0da11ed561fd Nov 26 04:43:46 localhost nova_compute_init[281584]: INFO:nova_statedir:Nova statedir ownership complete Nov 26 04:43:46 localhost systemd[1]: libpod-3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9.scope: Deactivated successfully. Nov 26 04:43:46 localhost podman[281585]: 2025-11-26 09:43:46.128027774 +0000 UTC m=+0.065045742 container died 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=nova_compute_init) Nov 26 04:43:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:43:46 localhost podman[281596]: 2025-11-26 09:43:46.219644426 +0000 UTC m=+0.092159710 container cleanup 3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 04:43:46 localhost systemd[1]: var-lib-containers-storage-overlay-1e67a08f7f89bb249239464cd2488dcb0276d30630b75fe760d996b7617d582f-merged.mount: Deactivated successfully. Nov 26 04:43:46 localhost systemd[1]: libpod-conmon-3f032c1307c1383e9e9f0aa4849db71c945330a8cb483b4401270616e20b08e9.scope: Deactivated successfully. 
Nov 26 04:43:46 localhost podman[281622]: 2025-11-26 09:43:46.321341471 +0000 UTC m=+0.085336460 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:43:46 localhost podman[281622]: 2025-11-26 09:43:46.366435815 +0000 UTC m=+0.130430794 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:43:46 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:43:46 localhost nova_compute[281415]: 2025-11-26 09:43:46.654 281419 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:43:46 localhost nova_compute[281415]: 2025-11-26 09:43:46.655 281419 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:43:46 localhost nova_compute[281415]: 2025-11-26 09:43:46.655 281419 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 26 04:43:46 localhost nova_compute[281415]: 2025-11-26 09:43:46.655 281419 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Nov 26 04:43:46 localhost nova_compute[281415]: 2025-11-26 09:43:46.811 281419 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:43:46 localhost nova_compute[281415]: 2025-11-26 09:43:46.838 281419 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:43:46 localhost nova_compute[281415]: 2025-11-26 09:43:46.839 281419 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.277 281419 INFO nova.virt.driver [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.398 281419 INFO nova.compute.provider_config [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.408 281419 DEBUG oslo_concurrency.lockutils [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.408 281419 DEBUG oslo_concurrency.lockutils [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.408 281419 DEBUG oslo_concurrency.lockutils [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.408 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.408 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ******************************************************************************** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.409 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.409 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.409 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.409 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.409 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.409 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.409 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] backdoor_port = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.409 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.410 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.410 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.410 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.410 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.410 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.410 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.410 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.411 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.411 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.411 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] console_host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.411 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.411 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.411 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.411 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.411 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.412 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.412 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.412 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 
localhost nova_compute[281415]: 2025-11-26 09:43:47.412 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.412 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.412 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.412 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.413 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.413 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.413 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.413 281419 DEBUG 
oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.413 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.413 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.413 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] host = np0005536118.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.414 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.414 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.414 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.414 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.414 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.414 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.414 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.415 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.415 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.415 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.415 281419 DEBUG 
oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.415 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.415 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.415 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.416 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.416 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.416 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.416 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.416 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.416 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.416 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.416 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.417 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.417 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.417 281419 DEBUG 
oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.417 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.417 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.417 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.417 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.417 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.418 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.418 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.418 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.418 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.418 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.418 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.418 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.419 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] metadata_listen_port = 8775 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.419 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.419 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.419 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.419 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] my_block_storage_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.419 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] my_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.419 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.420 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.420 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.420 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.420 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.420 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.420 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.420 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.420 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 
04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.421 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.421 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.421 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.421 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.421 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.421 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.421 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.422 281419 
DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.422 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.422 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.422 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.422 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.422 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.422 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.422 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] reserved_host_cpus = 0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.423 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.423 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.423 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.423 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.423 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.423 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.424 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.424 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.424 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.424 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.424 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.424 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.424 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.424 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_down_time = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.425 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.425 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.425 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.425 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.425 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.425 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.425 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.426 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.426 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.426 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.426 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.426 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.426 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.426 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.426 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.427 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.427 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.427 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.427 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.427 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.427 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.427 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vcpu_pin_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.428 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.428 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.428 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.428 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.428 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.428 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.428 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.428 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.429 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.429 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.429 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.429 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.429 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.429 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.auth_strategy = keystone 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.429 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.430 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.430 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.430 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.430 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.430 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.430 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.430 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.431 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.431 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.431 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.431 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.431 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.431 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.431 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.432 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.432 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.432 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.432 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.432 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.432 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.432 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.432 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.433 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.433 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.433 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.433 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.433 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.enable_retry_client = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.433 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.434 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.434 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.434 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.434 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.434 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.434 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_password = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.434 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.434 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.435 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.435 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.435 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.435 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.435 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_socket_timeout = 1.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.435 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.435 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.436 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.436 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.436 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.436 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.436 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.436 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.436 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.437 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.437 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.437 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.437 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.437 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.437 281419 DEBUG 
oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.437 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.438 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.438 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.438 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.438 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.438 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.438 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 
- - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.438 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.438 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.439 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.439 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.439 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.439 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.439 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.cpu_dedicated_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.439 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.439 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.440 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.440 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.440 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.440 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.440 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.provider_config_location = 
/etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.440 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.440 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.440 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.441 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.441 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.441 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.441 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.441 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.441 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.442 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.442 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.442 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.442 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.442 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.442 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.442 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.442 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.443 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.443 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.443 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.443 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.443 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.443 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.443 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.443 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.444 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.444 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.444 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.444 281419 
DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.444 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.444 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.444 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.445 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.445 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.445 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.445 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.445 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.445 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.445 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.446 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.446 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.446 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.446 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.446 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.446 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.446 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.447 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.447 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.447 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.447 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.connection_debug = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.447 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.447 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.447 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.448 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.448 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.448 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.448 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.448 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.448 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.448 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.448 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.449 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.449 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.449 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.pool_timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.449 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.449 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.449 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.449 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.450 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.450 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.450 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ephemeral_storage_encryption.key_size = 512 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.450 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.450 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.450 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.450 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.450 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.451 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.451 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.451 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.451 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.451 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.451 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.451 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.452 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.452 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 
2025-11-26 09:43:47.452 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.452 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.452 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.452 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.452 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.452 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.453 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.453 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - 
- - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.453 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.453 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.453 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.453 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.453 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.454 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.454 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.verify_glance_signatures = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.454 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.454 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.454 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.454 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.454 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.454 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.455 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.enable_remotefx = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.455 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.455 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.455 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.455 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.455 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.455 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.456 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.456 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.456 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.456 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.456 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.456 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.456 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.457 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] mks.enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.457 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.457 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.457 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.457 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.457 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.458 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.458 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.458 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.458 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.458 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.458 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.458 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.458 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.459 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.459 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.459 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.459 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.459 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.459 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.460 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.460 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 
localhost nova_compute[281415]: 2025-11-26 09:43:47.460 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.460 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.460 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.460 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.460 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.460 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.461 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.461 281419 DEBUG 
oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.461 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.461 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.461 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.461 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.461 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.462 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.462 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.462 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.462 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.462 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.462 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.462 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.463 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.463 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.463 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.463 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.463 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.463 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.463 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.464 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.464 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.464 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.464 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.464 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.464 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.464 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.465 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.465 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.collect_timing = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.465 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.465 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.465 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.465 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.465 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.465 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.466 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.466 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.466 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.466 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.466 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.466 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.466 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.467 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.467 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.467 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.467 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.467 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.467 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.467 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.468 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.468 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.468 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.468 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.468 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.468 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.468 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.469 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.469 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.469 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.469 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.469 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.469 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.469 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.469 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.470 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.470 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.470 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.470 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.470 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.470 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.470 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.471 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.471 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.471 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.471 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.471 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.471 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.471 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.472 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.472 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.472 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.472 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.472 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.472 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.472 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.473 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.473 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.473 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.473 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.473 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.473 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.473 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.474 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.474 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.474 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.474 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.474 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.474 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.474 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.475 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 
2025-11-26 09:43:47.475 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.475 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.475 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.475 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.475 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.475 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.476 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.476 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.476 281419 WARNING oslo_config.cfg [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Nov 26 04:43:47 localhost nova_compute[281415]: live_migration_uri is deprecated for removal in favor of two other options that Nov 26 04:43:47 localhost nova_compute[281415]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Nov 26 04:43:47 localhost nova_compute[281415]: and ``live_migration_inbound_addr`` respectively. Nov 26 04:43:47 localhost nova_compute[281415]: ). Its value may be silently ignored in the future.#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.476 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.476 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.476 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.477 281419 DEBUG 
oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.477 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.477 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.477 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.477 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.477 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.477 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.478 281419 DEBUG 
oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.478 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.478 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.478 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.478 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.478 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.478 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.479 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.479 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rbd_secret_uuid = 0d5e5e6d-3c4b-5efe-8c65-346ae6715606 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.479 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.479 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.479 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.479 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.479 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.480 281419 DEBUG oslo_service.service 
[None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.480 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.480 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.480 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.480 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.480 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.480 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.481 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.481 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.481 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.481 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.481 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.481 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.481 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.482 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.482 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.482 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.482 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.482 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.482 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.482 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.483 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.483 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.483 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.483 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.483 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.483 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.483 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.484 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.484 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.484 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.484 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.484 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.484 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.484 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.485 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.485 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.485 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.485 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.485 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.485 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.485 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.486 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.486 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.486 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.486 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.486 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.486 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.487 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.487 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.487 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.487 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.487 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.487 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.487 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.488 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.488 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 
2025-11-26 09:43:47.488 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.488 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.488 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.488 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.489 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.489 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.489 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 
2025-11-26 09:43:47.489 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.489 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.489 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.489 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.490 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.490 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.490 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.490 281419 
DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.490 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.490 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.490 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.491 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.491 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.491 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.491 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.491 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.491 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.491 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.491 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.492 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.492 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.492 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.492 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.492 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.492 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.492 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.493 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.493 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.493 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.system_scope = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.493 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.493 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.493 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.493 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.494 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.494 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.494 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.494 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.494 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.494 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.494 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.495 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.495 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.495 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.injected_files = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.495 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.495 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.495 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.495 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.496 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.496 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.496 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.496 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.496 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.496 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.497 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.497 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.497 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.497 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.497 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.497 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.497 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.498 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.498 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.498 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.498 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.498 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.498 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.498 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.499 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.499 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.499 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.499 281419 
DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.499 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.499 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.499 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.500 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.500 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.500 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.max_instances_per_host = 
50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.500 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.500 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.500 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.501 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.501 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.501 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.501 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.501 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.501 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.502 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.502 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.502 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.502 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 
2025-11-26 09:43:47.502 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.502 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.502 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.503 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.503 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.503 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.503 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 
2025-11-26 09:43:47.503 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.503 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.503 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.504 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.504 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.504 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.504 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.504 281419 DEBUG oslo_service.service 
[None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.504 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.504 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.505 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.505 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.505 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.505 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.505 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.505 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.506 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.506 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.506 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.506 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.506 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.506 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - 
- -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.506 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.507 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.507 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.507 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.507 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.507 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.507 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vendordata_dynamic_auth.auth_type = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.507 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.508 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.508 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.508 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.508 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.508 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.508 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.508 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.508 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.509 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.509 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.509 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.509 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.509 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.datastore_regex = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.509 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.509 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.510 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.510 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.510 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.510 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.510 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.510 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.510 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.510 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.511 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.511 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.511 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.511 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 
09:43:47.511 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.511 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.511 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.512 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.512 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.512 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.512 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.512 
281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.512 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.513 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.513 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.513 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.513 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.513 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.513 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.514 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.514 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.514 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.514 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.514 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.514 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.514 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.514 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.515 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.515 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.515 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.515 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.515 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.515 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.516 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.516 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.516 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.516 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.516 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.516 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.516 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.517 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.517 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.517 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.517 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.517 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.517 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.517 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.518 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.518 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.518 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.518 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.518 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.518 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.518 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.519 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.519 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.519 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.519 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.519 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.519 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.519 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.520 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.520 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.520 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.520 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.520 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 
localhost nova_compute[281415]: 2025-11-26 09:43:47.520 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.520 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.521 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.521 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.521 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.521 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.521 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 
04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.521 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.522 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.522 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.522 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.522 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.522 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.522 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.522 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.523 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.523 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.523 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.523 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.523 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.523 
281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.523 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.524 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.524 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.524 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.524 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.524 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.524 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.524 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.525 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.525 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.525 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.525 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.525 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_notifications.driver = ['noop'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.525 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.525 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.526 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.526 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.526 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.526 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.526 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.526 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.526 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.527 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.527 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.527 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.527 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.527 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.527 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.527 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.528 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.528 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.528 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.528 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.528 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.528 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.528 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.528 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.529 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.529 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.529 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.529 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 
2025-11-26 09:43:47.529 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.529 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.529 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.530 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.530 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.530 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.530 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.530 281419 DEBUG 
oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.530 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.531 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.531 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.531 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.531 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.531 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.531 281419 DEBUG oslo_service.service [None 
req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.532 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.532 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.532 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.532 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.532 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.532 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost 
nova_compute[281415]: 2025-11-26 09:43:47.532 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.533 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.533 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.533 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.533 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.533 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.533 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.534 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.534 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.534 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.534 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.534 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.534 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.534 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.535 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.535 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.535 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.535 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.535 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.535 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.535 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] 
os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.536 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.536 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.536 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.536 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.536 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.536 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.536 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] privsep_osbrick.logger_name = 
os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.536 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.537 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.537 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost systemd[1]: session-60.scope: Deactivated successfully. Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.537 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.537 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost systemd[1]: session-60.scope: Consumed 1min 37.186s CPU time. 
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.537 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.537 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.538 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.538 281419 DEBUG oslo_service.service [None req-014540dc-2961-4a83-83cc-b223116750f6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.538 281419 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Nov 26 04:43:47 localhost systemd-logind[761]: Session 60 logged out. Waiting for processes to exit. Nov 26 04:43:47 localhost systemd-logind[761]: Removed session 60. 
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.552 281419 INFO nova.virt.node [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Determined node identity 05276789-7461-410b-9529-16f5185a8bff from /var/lib/nova/compute_id#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.553 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.553 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.554 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.554 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.572 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.575 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m 
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.576 281419 INFO nova.virt.libvirt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.583 281419 INFO nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Libvirt host capabilities
[libvirt capabilities XML dump elided: the XML markup was stripped during extraction, leaving only scattered values. Recoverable details: host UUID 54d67e25-3d53-4e7f-ba95-c2d307a21761; arch x86_64; CPU model EPYC-Rome-v4 (vendor AMD); migration transports tcp and rdma; memory 16116612 KiB (4029153 pages); secmodels selinux (baselabels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (+107:+107); hvm guest support for i686 (wordsize 32) and x86_64 (wordsize 64) via emulator /usr/libexec/qemu-kvm, with machine types pc-i440fx-rhel7.6.0 (canonical pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (canonical q35).]
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.589 281419 DEBUG nova.virt.libvirt.volume.mount [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.591 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.599 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
[libvirt domainCapabilities XML dump elided: markup stripped during extraction. Recoverable details: emulator /usr/libexec/qemu-kvm; domain type kvm; machine pc-i440fx-rhel7.6.0; arch i686; os loader /usr/share/OVMF/OVMF_CODE.secboot.fd (types rom and pflash; readonly yes/no; secure no); host-model CPU EPYC-Rome (vendor AMD); supported custom CPU models including 486/486-v1, the Broadwell family (base, IBRS, noTSX, noTSX-IBRS, v1-v4), the Cascadelake-Server family (base, noTSX, v1-v5), Conroe/Conroe-v1, Cooperlake (base, v1-v2), Denverton (base, v1-v3), Dhyana (base, v1-v2), the EPYC family (EPYC, EPYC-Genoa/-v1, EPYC-IBPB, EPYC-Milan v1-v2, EPYC-Rome v1-v4, EPYC-v1 through EPYC-v4), and GraniteRapids (base, v1-v2). Dump truncated mid-stream at this point in the capture.]
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-noTSX Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-noTSX-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-noTSX Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 
Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v5 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v6 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v7 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 
Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: IvyBridge Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: IvyBridge-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: IvyBridge-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: IvyBridge-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: KnightsMill Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: KnightsMill-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nehalem Nov 26 04:43:47 localhost nova_compute[281415]: Nehalem-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nehalem-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nehalem-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G1 Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G1-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G2 Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G2-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G3 Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G3-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G4-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G5 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G5-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Penryn Nov 26 04:43:47 localhost nova_compute[281415]: Penryn-v1 Nov 26 04:43:47 localhost nova_compute[281415]: SandyBridge Nov 26 04:43:47 localhost nova_compute[281415]: SandyBridge-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: SandyBridge-v1 Nov 26 04:43:47 localhost nova_compute[281415]: SandyBridge-v2 Nov 26 04:43:47 localhost nova_compute[281415]: SapphireRapids Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: SapphireRapids-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: SapphireRapids-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: SapphireRapids-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: SierraForest Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
Nov 26 04:43:47 localhost nova_compute[281415]: [libvirt domainCapabilities XML dump; element tags were lost in log transport, leaving only text nodes and repeated syslog prefixes. Recoverable values, grouped:]
Nov 26 04:43:47 localhost nova_compute[281415]: CPU models (cont.): SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 26 04:43:47 localhost nova_compute[281415]: memory backing source types: file, anonymous, memfd
Nov 26 04:43:47 localhost nova_compute[281415]: disk devices: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Nov 26 04:43:47 localhost nova_compute[281415]: graphics types: vnc, egl-headless, dbus
Nov 26 04:43:47 localhost nova_compute[281415]: hostdev mode: subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Nov 26 04:43:47 localhost nova_compute[281415]: rng models: virtio, virtio-transitional, virtio-non-transitional; backend models: random, egd, builtin
Nov 26 04:43:47 localhost nova_compute[281415]: filesystem driver types: path, handle, virtiofs
Nov 26 04:43:47 localhost nova_compute[281415]: tpm models: tpm-tis, tpm-crb; backend models: emulator, external; backend version: 2.0
Nov 26 04:43:47 localhost nova_compute[281415]: redirdev bus: usb; channel types: pty, unix
Nov 26 04:43:47 localhost nova_compute[281415]: crypto model: qemu; backend model: builtin
Nov 26 04:43:47 localhost nova_compute[281415]: interface backend types: default, passt
Nov 26 04:43:47 localhost nova_compute[281415]: panic models: isa, hyperv
Nov 26 04:43:47 localhost nova_compute[281415]: character device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Nov 26 04:43:47 localhost nova_compute[281415]: hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; associated values: 4095, on, off, off, Linux KVM Hv
Nov 26 04:43:47 localhost nova_compute[281415]: launch security type: tdx
Nov 26 04:43:47 localhost nova_compute[281415]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.605 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 26 04:43:47 localhost nova_compute[281415]: [domainCapabilities XML, tags stripped as above. Recoverable values:]
Nov 26 04:43:47 localhost nova_compute[281415]: emulator path: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-q35-rhel9.8.0; arch: i686
Nov 26 04:43:47 localhost nova_compute[281415]: loader value: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader types: rom, pflash; readonly: yes, no; secure: no; additional toggles: on, off; on, off
Nov 26 04:43:47 localhost nova_compute[281415]: host-model CPU: EPYC-Rome (vendor AMD)
Nov 26 04:43:47 localhost nova_compute[281415]: CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, …
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cooperlake-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v3 Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Genoa Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: EPYC-Genoa-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-IBPB Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Milan Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 
EPYC-Milan-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Milan-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 
EPYC-Rome-v4 Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v1 Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v2 Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell Nov 26 04:43:47 localhost nova_compute[281415]: 
Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-noTSX Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-noTSX-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Haswell-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-noTSX Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Icelake-Server-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v5 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v6 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v7 Nov 26 04:43:47 localhost nova_compute[281415]: 
Nov 26 04:43:47 localhost nova_compute[281415]: [libvirt domain-capabilities XML flattened by the journal; only the CPU model names survive] IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
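The run of names above is the text content of the CPU `<model>` elements from libvirt's domain-capabilities XML, with the surrounding tags lost when the journal flattened the document. As a minimal sketch of what the intact document looks like and how the model list can be recovered programmatically with Python's stdlib parser — the sample fragment and the `usable_cpu_models` helper are illustrative, not part of nova:

```python
import xml.etree.ElementTree as ET

# Illustrative fragment shaped like libvirt's <domainCapabilities> output;
# the model names here are examples, not this host's full list.
SAMPLE = """
<domainCapabilities>
  <cpu>
    <mode name='custom' supported='yes'>
      <model usable='yes'>IvyBridge</model>
      <model usable='no'>SapphireRapids</model>
      <model usable='yes'>qemu64</model>
    </mode>
  </cpu>
</domainCapabilities>
"""

def usable_cpu_models(xml_text: str) -> list[str]:
    """Return CPU model names marked usable='yes' in custom mode."""
    root = ET.fromstring(xml_text)
    return [
        m.text
        for m in root.findall(".//cpu/mode[@name='custom']/model")
        if m.get("usable") == "yes"
    ]

print(usable_cpu_models(SAMPLE))  # → ['IvyBridge', 'qemu64']
```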
Nov 26 04:43:47 localhost nova_compute[281415]: [device/feature enums from the same flattened XML] memory backing source types: file, anonymous, memfd; disk devices: disk, cdrom, floppy, lun; disk buses: fdc, scsi, virtio, usb, sata; disk models: virtio, virtio-transitional, virtio-non-transitional; graphics: vnc, egl-headless, dbus; hostdev mode: subsystem; startup policy: default, mandatory, requisite, optional; hostdev subsystem types: usb, pci, scsi; hostdev scsi models: virtio, virtio-transitional, virtio-non-transitional; rng backends: random, egd, builtin; filesystem driver types: path, handle, virtiofs; tpm models: tpm-tis, tpm-crb; tpm backends: emulator, external (backend version 2.0); redirdev bus: usb; channel types: pty, unix; crypto: qemu, builtin; interface backends: default, passt; panic models: isa, hyperv; char device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus; hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; additional values: 4095, on, off, off, Linux KVM Hv; launch security: tdx
Nov 26 04:43:47 localhost nova_compute[281415]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.655 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.660 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 26 04:43:47 localhost nova_compute[281415]: [flattened q35 domain-capabilities XML; recoverable fields] emulator: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64; firmware: efi; loader paths: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types: rom, pflash; readonly: yes, no; secure: yes, no; additional values: on, off [entry truncated in source]
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: on Nov 26 04:43:47 localhost nova_compute[281415]: off Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome Nov 26 04:43:47 localhost nova_compute[281415]: AMD Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 486 Nov 26 04:43:47 localhost nova_compute[281415]: 486-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-noTSX Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-noTSX-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-noTSX Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v5 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Conroe Nov 26 04:43:47 localhost nova_compute[281415]: Conroe-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Cooperlake Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cooperlake-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cooperlake-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Genoa Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Genoa-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-IBPB Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Milan Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Milan-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Milan-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v4 Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v1 Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v2 Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
Nov 26 04:43:47 localhost nova_compute[281415]: Supported CPU models reported by libvirt: Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: n270-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: pentium Nov 26 04:43:47 localhost nova_compute[281415]: pentium-v1 Nov 26 04:43:47 localhost nova_compute[281415]: pentium2 Nov 26 04:43:47 localhost nova_compute[281415]: pentium2-v1 Nov 26 04:43:47 localhost nova_compute[281415]: pentium3 Nov 26 04:43:47 localhost nova_compute[281415]: pentium3-v1 Nov 26 04:43:47 localhost nova_compute[281415]: phenom Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: phenom-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: qemu32 Nov 26 04:43:47 localhost nova_compute[281415]: qemu32-v1 Nov 26 04:43:47 localhost nova_compute[281415]: qemu64 Nov 26 04:43:47 localhost nova_compute[281415]: qemu64-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: file Nov 26 04:43:47 localhost nova_compute[281415]: anonymous Nov 26 04:43:47 localhost nova_compute[281415]: memfd Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: disk Nov 26 04:43:47 localhost nova_compute[281415]: cdrom 
Nov 26 04:43:47 localhost nova_compute[281415]: floppy Nov 26 04:43:47 localhost nova_compute[281415]: lun Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: fdc Nov 26 04:43:47 localhost nova_compute[281415]: scsi Nov 26 04:43:47 localhost nova_compute[281415]: virtio Nov 26 04:43:47 localhost nova_compute[281415]: usb Nov 26 04:43:47 localhost nova_compute[281415]: sata Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: virtio Nov 26 04:43:47 localhost nova_compute[281415]: virtio-transitional Nov 26 04:43:47 localhost nova_compute[281415]: virtio-non-transitional Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: vnc Nov 26 04:43:47 localhost nova_compute[281415]: egl-headless Nov 26 04:43:47 localhost nova_compute[281415]: dbus Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: subsystem Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: default Nov 26 04:43:47 localhost nova_compute[281415]: mandatory Nov 26 04:43:47 localhost nova_compute[281415]: requisite Nov 26 04:43:47 localhost nova_compute[281415]: optional Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: usb Nov 26 04:43:47 localhost nova_compute[281415]: pci Nov 26 04:43:47 localhost nova_compute[281415]: scsi Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: virtio Nov 26 04:43:47 localhost nova_compute[281415]: virtio-transitional Nov 26 04:43:47 localhost nova_compute[281415]: virtio-non-transitional Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: random Nov 26 04:43:47 localhost nova_compute[281415]: egd Nov 26 04:43:47 localhost nova_compute[281415]: builtin Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: path Nov 26 04:43:47 localhost nova_compute[281415]: handle Nov 26 04:43:47 localhost nova_compute[281415]: virtiofs Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: tpm-tis Nov 26 04:43:47 localhost nova_compute[281415]: tpm-crb Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: emulator Nov 26 04:43:47 localhost nova_compute[281415]: external Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 2.0 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: usb 
Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: pty Nov 26 04:43:47 localhost nova_compute[281415]: unix Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: qemu Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: builtin Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: default Nov 26 04:43:47 localhost nova_compute[281415]: passt Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: isa Nov 26 04:43:47 localhost nova_compute[281415]: hyperv Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: null Nov 26 04:43:47 localhost nova_compute[281415]: vc Nov 26 04:43:47 localhost nova_compute[281415]: pty Nov 26 04:43:47 localhost nova_compute[281415]: dev Nov 26 04:43:47 localhost nova_compute[281415]: file Nov 26 04:43:47 localhost nova_compute[281415]: pipe Nov 26 04:43:47 localhost nova_compute[281415]: stdio Nov 26 04:43:47 localhost nova_compute[281415]: udp Nov 26 04:43:47 localhost 
nova_compute[281415]: tcp Nov 26 04:43:47 localhost nova_compute[281415]: unix Nov 26 04:43:47 localhost nova_compute[281415]: qemu-vdagent Nov 26 04:43:47 localhost nova_compute[281415]: dbus Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: relaxed Nov 26 04:43:47 localhost nova_compute[281415]: vapic Nov 26 04:43:47 localhost nova_compute[281415]: spinlocks Nov 26 04:43:47 localhost nova_compute[281415]: vpindex Nov 26 04:43:47 localhost nova_compute[281415]: runtime Nov 26 04:43:47 localhost nova_compute[281415]: synic Nov 26 04:43:47 localhost nova_compute[281415]: stimer Nov 26 04:43:47 localhost nova_compute[281415]: reset Nov 26 04:43:47 localhost nova_compute[281415]: vendor_id Nov 26 04:43:47 localhost nova_compute[281415]: frequencies Nov 26 04:43:47 localhost nova_compute[281415]: reenlightenment Nov 26 04:43:47 localhost nova_compute[281415]: tlbflush Nov 26 04:43:47 localhost nova_compute[281415]: ipi Nov 26 04:43:47 localhost nova_compute[281415]: avic Nov 26 04:43:47 localhost nova_compute[281415]: emsr_bitmap Nov 26 04:43:47 localhost nova_compute[281415]: xmm_input Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 4095 Nov 26 04:43:47 localhost 
nova_compute[281415]: on Nov 26 04:43:47 localhost nova_compute[281415]: off Nov 26 04:43:47 localhost nova_compute[281415]: off Nov 26 04:43:47 localhost nova_compute[281415]: Linux KVM Hv Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: tdx Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.756 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: /usr/libexec/qemu-kvm Nov 26 04:43:47 localhost nova_compute[281415]: kvm Nov 26 04:43:47 localhost nova_compute[281415]: pc-i440fx-rhel7.6.0 Nov 26 04:43:47 localhost nova_compute[281415]: x86_64 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: rom Nov 26 04:43:47 localhost nova_compute[281415]: pflash Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: yes Nov 26 04:43:47 localhost nova_compute[281415]: no Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: no Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: on Nov 26 04:43:47 localhost nova_compute[281415]: off Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: on Nov 26 04:43:47 localhost nova_compute[281415]: off Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome Nov 26 04:43:47 localhost nova_compute[281415]: AMD Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 486 Nov 26 04:43:47 localhost nova_compute[281415]: 486-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-noTSX Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-noTSX-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 
Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Broadwell-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-noTSX Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cascadelake-Server-v5 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Conroe Nov 26 04:43:47 localhost nova_compute[281415]: Conroe-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Cooperlake Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cooperlake-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 
26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Cooperlake-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Denverton-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana
Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Dhyana-v2
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Genoa
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Genoa-v1
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-IBPB
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Milan
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Milan-v1
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Milan-v2
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v1
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v2
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v3
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-Rome-v4
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v1
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v2
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v3
Nov 26 04:43:47 localhost nova_compute[281415]: EPYC-v4
Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids
Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids-v1
Nov 26 04:43:47 localhost nova_compute[281415]: GraniteRapids-v2
Nov 26 04:43:47 localhost nova_compute[281415]: Haswell
Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-IBRS
Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-noTSX
Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-noTSX-IBRS
Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v2
Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v3
Nov 26 04:43:47 localhost nova_compute[281415]: Haswell-v4
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-noTSX
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v2
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v3
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v4
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v5
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v6
Nov 26 04:43:47 localhost nova_compute[281415]: Icelake-Server-v7
Nov 26 04:43:47 localhost nova_compute[281415]: IvyBridge
Nov 26 04:43:47 localhost nova_compute[281415]: IvyBridge-IBRS
Nov 26 04:43:47 localhost nova_compute[281415]: IvyBridge-v1
Nov 26 04:43:47 localhost nova_compute[281415]: IvyBridge-v2
Nov 26 04:43:47 localhost nova_compute[281415]: KnightsMill
Nov 26 04:43:47 localhost nova_compute[281415]: KnightsMill-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Nehalem
Nov 26 04:43:47 localhost nova_compute[281415]: Nehalem-IBRS
Nov 26 04:43:47 localhost nova_compute[281415]: Nehalem-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Nehalem-v2
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G1
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G1-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G2
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G2-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G3
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G3-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G4
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G4-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G5
Nov 26 04:43:47 localhost nova_compute[281415]: Opteron_G5-v1
Nov 26 04:43:47 localhost nova_compute[281415]: Penryn
Nov 26 04:43:47 localhost nova_compute[281415]: Penryn-v1
Nov 26 04:43:47 localhost nova_compute[281415]: SandyBridge
Nov 26 04:43:47 localhost nova_compute[281415]: SandyBridge-IBRS
Nov 26 04:43:47 localhost nova_compute[281415]: SandyBridge-v1
Nov 26 04:43:47 localhost nova_compute[281415]: SandyBridge-v2
Nov 26 04:43:47 localhost nova_compute[281415]: SapphireRapids
Nov 26 04:43:47 localhost nova_compute[281415]: SapphireRapids-v1
Nov 26 04:43:47 localhost nova_compute[281415]: SapphireRapids-v2
Nov 26 04:43:47 localhost nova_compute[281415]: SapphireRapids-v3
Nov 26 04:43:47 localhost nova_compute[281415]: SierraForest
Nov 26 04:43:47 localhost nova_compute[281415]: SierraForest-v1
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Client Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Client-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Client-noTSX-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Client-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Client-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Client-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Client-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Server Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Server-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 
localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Server-noTSX-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Server-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Server-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Server-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Server-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Skylake-Server-v5 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 
26 04:43:47 localhost nova_compute[281415]: Snowridge Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Snowridge-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Snowridge-v2 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Snowridge-v3 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Snowridge-v4 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Westmere Nov 26 04:43:47 localhost nova_compute[281415]: Westmere-IBRS Nov 26 04:43:47 localhost nova_compute[281415]: Westmere-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Westmere-v2 Nov 26 04:43:47 localhost nova_compute[281415]: athlon Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: athlon-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: core2duo Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: core2duo-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: coreduo Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: coreduo-v1 Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: kvm32 Nov 26 04:43:47 localhost nova_compute[281415]: kvm32-v1 Nov 26 04:43:47 localhost nova_compute[281415]: kvm64 Nov 26 04:43:47 localhost nova_compute[281415]: kvm64-v1 Nov 26 04:43:47 localhost nova_compute[281415]: n270 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: n270-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: pentium Nov 26 04:43:47 localhost nova_compute[281415]: pentium-v1 Nov 26 04:43:47 localhost nova_compute[281415]: pentium2 Nov 26 04:43:47 localhost nova_compute[281415]: pentium2-v1 Nov 26 04:43:47 localhost nova_compute[281415]: pentium3 Nov 26 04:43:47 localhost nova_compute[281415]: pentium3-v1 Nov 26 04:43:47 localhost nova_compute[281415]: phenom Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: phenom-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: qemu32 Nov 26 04:43:47 localhost nova_compute[281415]: qemu32-v1 Nov 26 04:43:47 localhost nova_compute[281415]: qemu64 Nov 26 04:43:47 localhost nova_compute[281415]: qemu64-v1 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost 
nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: file Nov 26 04:43:47 localhost nova_compute[281415]: anonymous Nov 26 04:43:47 localhost nova_compute[281415]: memfd Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: disk Nov 26 04:43:47 localhost nova_compute[281415]: cdrom Nov 26 04:43:47 localhost nova_compute[281415]: floppy Nov 26 04:43:47 localhost nova_compute[281415]: lun Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: ide Nov 26 04:43:47 localhost nova_compute[281415]: fdc Nov 26 04:43:47 localhost nova_compute[281415]: scsi Nov 26 04:43:47 localhost nova_compute[281415]: virtio Nov 26 04:43:47 localhost nova_compute[281415]: usb Nov 26 04:43:47 localhost nova_compute[281415]: sata Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: virtio Nov 26 04:43:47 localhost nova_compute[281415]: virtio-transitional Nov 26 04:43:47 localhost nova_compute[281415]: virtio-non-transitional Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: vnc Nov 26 04:43:47 localhost nova_compute[281415]: egl-headless Nov 26 04:43:47 localhost nova_compute[281415]: dbus Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 
subsystem Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: default Nov 26 04:43:47 localhost nova_compute[281415]: mandatory Nov 26 04:43:47 localhost nova_compute[281415]: requisite Nov 26 04:43:47 localhost nova_compute[281415]: optional Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: usb Nov 26 04:43:47 localhost nova_compute[281415]: pci Nov 26 04:43:47 localhost nova_compute[281415]: scsi Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: virtio Nov 26 04:43:47 localhost nova_compute[281415]: virtio-transitional Nov 26 04:43:47 localhost nova_compute[281415]: virtio-non-transitional Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: random Nov 26 04:43:47 localhost nova_compute[281415]: egd Nov 26 04:43:47 localhost nova_compute[281415]: builtin Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: path Nov 26 04:43:47 localhost nova_compute[281415]: handle Nov 26 04:43:47 localhost nova_compute[281415]: virtiofs Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: tpm-tis Nov 26 04:43:47 localhost nova_compute[281415]: tpm-crb Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: emulator Nov 26 04:43:47 localhost nova_compute[281415]: external Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 2.0 Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: usb Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: pty Nov 26 04:43:47 localhost nova_compute[281415]: unix Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: qemu Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: builtin Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: default Nov 26 04:43:47 localhost nova_compute[281415]: passt Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: isa Nov 26 04:43:47 localhost nova_compute[281415]: hyperv Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 
04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: null Nov 26 04:43:47 localhost nova_compute[281415]: vc Nov 26 04:43:47 localhost nova_compute[281415]: pty Nov 26 04:43:47 localhost nova_compute[281415]: dev Nov 26 04:43:47 localhost nova_compute[281415]: file Nov 26 04:43:47 localhost nova_compute[281415]: pipe Nov 26 04:43:47 localhost nova_compute[281415]: stdio Nov 26 04:43:47 localhost nova_compute[281415]: udp Nov 26 04:43:47 localhost nova_compute[281415]: tcp Nov 26 04:43:47 localhost nova_compute[281415]: unix Nov 26 04:43:47 localhost nova_compute[281415]: qemu-vdagent Nov 26 04:43:47 localhost nova_compute[281415]: dbus Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: relaxed Nov 26 04:43:47 localhost nova_compute[281415]: vapic Nov 26 04:43:47 localhost nova_compute[281415]: spinlocks Nov 26 04:43:47 localhost nova_compute[281415]: vpindex Nov 26 04:43:47 localhost nova_compute[281415]: runtime Nov 26 04:43:47 localhost nova_compute[281415]: synic Nov 26 04:43:47 localhost nova_compute[281415]: stimer Nov 26 04:43:47 localhost nova_compute[281415]: reset Nov 26 04:43:47 localhost nova_compute[281415]: vendor_id 
Nov 26 04:43:47 localhost nova_compute[281415]: frequencies Nov 26 04:43:47 localhost nova_compute[281415]: reenlightenment Nov 26 04:43:47 localhost nova_compute[281415]: tlbflush Nov 26 04:43:47 localhost nova_compute[281415]: ipi Nov 26 04:43:47 localhost nova_compute[281415]: avic Nov 26 04:43:47 localhost nova_compute[281415]: emsr_bitmap Nov 26 04:43:47 localhost nova_compute[281415]: xmm_input Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: 4095 Nov 26 04:43:47 localhost nova_compute[281415]: on Nov 26 04:43:47 localhost nova_compute[281415]: off Nov 26 04:43:47 localhost nova_compute[281415]: off Nov 26 04:43:47 localhost nova_compute[281415]: Linux KVM Hv Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: tdx Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: Nov 26 04:43:47 localhost nova_compute[281415]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.828 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.828 281419 INFO nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Secure Boot support detected#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.831 281419 INFO nova.virt.libvirt.driver [None 
req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.847 281419 DEBUG nova.virt.libvirt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.969 281419 INFO nova.virt.node [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Determined node identity 05276789-7461-410b-9529-16f5185a8bff from /var/lib/nova/compute_id#033[00m Nov 26 04:43:47 localhost nova_compute[281415]: 2025-11-26 09:43:47.989 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Verified node 05276789-7461-410b-9529-16f5185a8bff matches my host np0005536118.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.044 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.051 281419 DEBUG nova.virt.libvirt.vif [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-26T08:29:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005536118.localdomain',hostname='test',id=2,image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-26T08:29:20Z,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005536118.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b2fe3cd6f6ea49b8a2de01b236dd92e3',ramdisk_id='',reservation_id='r-hokjvvqr',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-11-26T08:29:20Z,user_data=None,user_id='9f8fafc3f43241c3a71039595891ea0e',uuid=9d78bef9-6977-4fb5-b50b-ae75124e73af,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.051 281419 DEBUG nova.network.os_vif_util [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Converting VIF {"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.052 281419 DEBUG nova.network.os_vif_util 
[None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.054 281419 DEBUG os_vif [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.135 281419 DEBUG ovsdbapp.backend.ovs_idl [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.136 281419 DEBUG ovsdbapp.backend.ovs_idl [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.136 281419 DEBUG ovsdbapp.backend.ovs_idl [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.136 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.137 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.137 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.137 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.139 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.142 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.157 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.158 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, 
datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.158 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.159 281419 INFO oslo.privsep.daemon [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpgw9bybs0/privsep.sock']#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.819 281419 INFO oslo.privsep.daemon [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.679 281691 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.684 281691 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.688 281691 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 26 04:43:48 localhost nova_compute[281415]: 2025-11-26 09:43:48.688 281691 INFO oslo.privsep.daemon [-] privsep daemon running as pid 281691#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.124 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 
04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.125 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5afdc9d0-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.126 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5afdc9d0-95, col_values=(('external_ids', {'iface-id': '5afdc9d0-9595-4904-b83b-3d24f739ffec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:0f:d8', 'vm-uuid': '9d78bef9-6977-4fb5-b50b-ae75124e73af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.127 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.127 281419 INFO os_vif [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95')#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.128 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.132 281419 DEBUG 
nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.133 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.294 281419 DEBUG oslo_concurrency.lockutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.295 281419 DEBUG oslo_concurrency.lockutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.295 281419 DEBUG oslo_concurrency.lockutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.296 281419 DEBUG nova.compute.resource_tracker [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.296 281419 DEBUG oslo_concurrency.processutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.783 281419 DEBUG oslo_concurrency.processutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.849 281419 DEBUG nova.virt.libvirt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:43:49 localhost nova_compute[281415]: 2025-11-26 09:43:49.849 281419 DEBUG nova.virt.libvirt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.093 281419 WARNING nova.virt.libvirt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.095 281419 DEBUG nova.compute.resource_tracker [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12125MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.095 281419 DEBUG oslo_concurrency.lockutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.096 281419 DEBUG oslo_concurrency.lockutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.288 281419 DEBUG nova.compute.resource_tracker [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.289 281419 DEBUG nova.compute.resource_tracker [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.289 281419 DEBUG nova.compute.resource_tracker [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.367 281419 DEBUG nova.scheduler.client.report [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.384 281419 DEBUG nova.scheduler.client.report [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 
04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.384 281419 DEBUG nova.compute.provider_tree [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.402 281419 DEBUG nova.scheduler.client.report [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.434 281419 DEBUG nova.scheduler.client.report [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_F16C,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.471 281419 DEBUG oslo_concurrency.processutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.961 281419 DEBUG oslo_concurrency.processutils [None 
req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.968 281419 DEBUG nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Nov 26 04:43:50 localhost nova_compute[281415]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.968 281419 INFO nova.virt.libvirt.host [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] kernel doesn't support AMD SEV#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.970 281419 DEBUG nova.compute.provider_tree [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:43:50 localhost nova_compute[281415]: 2025-11-26 09:43:50.970 281419 DEBUG nova.virt.libvirt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 26 04:43:51 localhost nova_compute[281415]: 2025-11-26 09:43:51.004 281419 DEBUG nova.scheduler.client.report [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:43:51 localhost nova_compute[281415]: 2025-11-26 09:43:51.038 281419 DEBUG nova.compute.resource_tracker [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:43:51 localhost nova_compute[281415]: 2025-11-26 09:43:51.038 281419 DEBUG oslo_concurrency.lockutils [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:43:51 localhost nova_compute[281415]: 2025-11-26 09:43:51.039 281419 DEBUG nova.service [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Nov 26 04:43:51 localhost nova_compute[281415]: 2025-11-26 09:43:51.065 281419 DEBUG nova.service [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Nov 26 04:43:51 localhost nova_compute[281415]: 2025-11-26 09:43:51.066 281419 DEBUG nova.servicegroup.drivers.db [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] DB_Driver: join new ServiceGroup member np0005536118.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Nov 26 04:43:51 localhost nova_compute[281415]: 2025-11-26 09:43:51.294 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:43:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:43:52 localhost systemd[1]: tmp-crun.RQPp8X.mount: Deactivated successfully. Nov 26 04:43:52 localhost podman[281740]: 2025-11-26 09:43:52.842419612 +0000 UTC m=+0.093553683 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible) Nov 26 04:43:52 localhost podman[281740]: 2025-11-26 09:43:52.854270009 +0000 UTC m=+0.105404120 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:43:52 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:43:52 localhost podman[281739]: 2025-11-26 09:43:52.930380132 +0000 UTC m=+0.185127545 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 04:43:52 localhost podman[281739]: 2025-11-26 09:43:52.961915248 +0000 UTC m=+0.216662691 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack 
Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 04:43:52 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:43:53 localhost nova_compute[281415]: 2025-11-26 09:43:53.176 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36916 DF PROTO=TCP SPT=53058 DPT=9102 SEQ=4057334572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C963490000000001030307) Nov 26 04:43:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36917 DF PROTO=TCP SPT=53058 DPT=9102 SEQ=4057334572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9673C0000000001030307) Nov 26 04:43:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60815 DF PROTO=TCP SPT=59486 DPT=9102 SEQ=338990211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C969FC0000000001030307) Nov 26 04:43:56 localhost nova_compute[281415]: 2025-11-26 09:43:56.306 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:43:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36918 DF PROTO=TCP SPT=53058 DPT=9102 SEQ=4057334572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A52C96F3D0000000001030307) Nov 26 04:43:57 localhost podman[240049]: time="2025-11-26T09:43:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:43:57 localhost podman[240049]: @ - - [26/Nov/2025:09:43:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148615 "" "Go-http-client/1.1" Nov 26 04:43:57 localhost podman[240049]: @ - - [26/Nov/2025:09:43:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17232 "" "Go-http-client/1.1" Nov 26 04:43:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25487 DF PROTO=TCP SPT=38760 DPT=9102 SEQ=2507384895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C973FC0000000001030307) Nov 26 04:43:58 localhost nova_compute[281415]: 2025-11-26 09:43:58.214 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36919 DF PROTO=TCP SPT=53058 DPT=9102 SEQ=4057334572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C97EFC0000000001030307) Nov 26 04:44:01 localhost nova_compute[281415]: 2025-11-26 09:44:01.360 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:03 localhost nova_compute[281415]: 2025-11-26 09:44:03.217 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:44:03.584 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.585 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.608 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 52.296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b913ed06-ea94-4153-ba55-a166dc5dacff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:44:03.585615', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '718327a2-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.850669568, 'message_signature': 'bb2094171cbbd559ed3f8057f29e04eb2f768b557f0f8c80572bc612454afe11'}]}, 'timestamp': '2025-11-26 09:44:03.609477', '_unique_id': '32ae9e25769d42a6bda3941841c07868'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.611 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.612 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 04:44:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.616 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd81bfe30-f905-4b35-acdc-fa20929a54f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.612814', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '71843e30-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 
'message_signature': '1c05599f742448b11401bba5a3732660e5697f9118ff6c80db1fc73413af8262'}]}, 'timestamp': '2025-11-26 09:44:03.616555', '_unique_id': 'bf2359a9f4aa458284e93aee4ba78d86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.617 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.618 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 26 04:44:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:03.646 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 26 04:44:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:03.647 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 26 04:44:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:03.649 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.655 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.656 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7bf48d7-6685-48d2-8e89-f557dabd9093', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.618761', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '718a50cc-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': '679350d0b4e047434ad548cb04e6c67642522f93eee162cc65b4e0d2a85c3cc7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.618761', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '718a6242-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': '693efdacb3829cefa9d27dda03f5de4d56e2730a9b02d5b65fd3c501a8ab67db'}]}, 'timestamp': '2025-11-26 09:44:03.656763', '_unique_id': 'a3b963ef0c65456cb0ebccf65699106a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.657 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.658 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.659 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.659 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5759f3e1-5be9-43e6-9c11-dddfbf62ed1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.659067', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '718ace4e-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': 'ab278ad0b14c8c14337413bcfb01afaa66caab7755ebd8aafb8d575f01eefdb2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.659067', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '718adf42-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': 'de4c402eca94fcf5189e3d5c856eb4a6d005a1de860ae19d139bd6ccf51bf83c'}]}, 'timestamp': '2025-11-26 09:44:03.659989', '_unique_id': '3ddfc6bbc9fc45dc9237453b0bf06caf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.660 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.662 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.662 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.662 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b800d456-17aa-475e-a614-ab5f2ae9d2a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.662180', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '718b46b2-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': 'c5a8fecc69c5cfe9d1773821e5f8a002fbabf292ca86b3bbe346b7ab5c3f56e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.662180', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '718b57e2-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': '5a9a41275eb19bfd5892c3ce961f065a556a5fe65f8b0c7e1a8801edca48d2d9'}]}, 'timestamp': '2025-11-26 09:44:03.663094', '_unique_id': 'bfb25f3f34a547049e0d602b34b20a2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26
09:44:03.664 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.665 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.665 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 627516836 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.665 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 21052656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cd2e60e-a40a-4ef3-81b4-355cf63e1bc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 627516836, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.665332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '718bc286-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': '231806daab9f21d46a80878971095a7cc918b48f459614d40d779ff5c81aacb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21052656, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.665332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '718bd44c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': '928125ac141489757fa9de18c59040ec8f656c4b0783125e21646bf10f87e3ad'}]}, 'timestamp': '2025-11-26 09:44:03.666240', '_unique_id': '1c15d687e13148b89358ca3a51a2f071'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26
04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.668 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64ac89bc-2678-4c49-a103-9389bdf88a31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.668621', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '718c4332-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': '24c57cbce0e9e0d225a6546d40fdfc63faebc5e6fcb2a01e586e9537525a60cd'}]}, 'timestamp': '2025-11-26 09:44:03.669141', '_unique_id': 'efad336fd02f46efa8abe6e2c47c0f8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.671 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'ea836cb9-5aad-40f7-8007-b303fcdcef37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.671249', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '718caa0c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': '82e5036172af00a61f3f7abe8a5b06e0b214a01de9ea17af71afb0ab8a94c4dd'}]}, 'timestamp': '2025-11-26 09:44:03.671737', '_unique_id': '515337303ff442a98169a173e1a7a489'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.673 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68cf67e0-f15d-413c-88ff-4d80348c6951', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.673836', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '718d0f42-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': 'f65053768d4140c57502ede3d0d7736dedb373e9a7b466ccd1f389510de74186'}]}, 'timestamp': '2025-11-26 09:44:03.674323', '_unique_id': 'c58d3b0fc4174924a10d9e424756f3de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:44:03.675 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.676 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '185c891b-7d07-4d2c-81b6-e37305b59ac1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.676458', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '718f8cf4-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.918713881, 'message_signature': 'c004996e1fcbe0458fb478e6966cd42031991228677982069de2386ffa66ddb0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.676458', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '718f9d8e-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.918713881, 'message_signature': '35370c5bbafb3e1dc99b76ee597b5133bd85ad6daa71fb8594df0ef3e3389c40'}]}, 'timestamp': '2025-11-26 09:44:03.691098', '_unique_id': 'd1eaa8320ac34d1f812c34ca7034ac10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:44:03.692 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:44:03.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.693 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.693 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.693 
12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '486da985-3573-4dc5-a887-4b59e63ca3f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.693449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71900c56-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': '5c50fd0f41b951ed063050bd4d3b2f95ecfcabe435621684f78dd5eed6177ce2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.693449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71901e62-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': '141f3b31bf59e1dbffb51c9888d340df24823940cac4327b4a651d134d1f83b2'}]}, 'timestamp': '2025-11-26 09:44:03.694346', '_unique_id': '15e7fb7acde540f9ba7710362de18cfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:44:03.695 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.696 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e013849-dfab-4cc4-b150-b0c022011589', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.696520', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '71908410-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': '560267f78d9aaf5c6aa12dd27992779295b8f500aae1c4454f67ca9d51dc7f0b'}]}, 'timestamp': '2025-11-26 09:44:03.697006', '_unique_id': '7926d6a6659743b18bc0482d2679210f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 
04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.698 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.699 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.699 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '467cc5c2-7a4b-46ce-9466-914ac4d56b17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.699094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7190e946-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.918713881, 'message_signature': '08b8da5bd65e968defd53cc7552f33daad3217937b62428ad6b7f569a37a4fb6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.699094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7190f9ae-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.918713881, 'message_signature': '01776bea2b312e15b6f5b8197022e2220b237983eb754ac26392778907ae4dd3'}]}, 'timestamp': '2025-11-26 09:44:03.699992', '_unique_id': '29e56b26a4a143998bac61652776745a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:44:03.700 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:44:03.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.702 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0316d3ca-639a-49e7-877f-e792436ea644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.702120', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '71915ee4-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': '8c3a941ff93b756c0b0655f884cd9fb10b7bc80a69637638cc6f2f05f45b4ecb'}]}, 'timestamp': '2025-11-26 09:44:03.702576', '_unique_id': '97ec1ac54a5848baad578f9cb6ce251d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.703 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.704 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.704 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.705 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33aa2ab4-6e02-4d0b-9570-a04c74edfda1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.704686', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7191c2c6-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.918713881, 'message_signature': 'a8b57d90fcd4e686f788a3aa38c251e589bd9c8e7705167ac5b8a6f2124fb03f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.704686', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7191d432-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.918713881, 'message_signature': '621abc645954dd9dac687089f9b1bf7987c25a330d1a652516fbd09ca4f0ac33'}]}, 'timestamp': '2025-11-26 09:44:03.705550', '_unique_id': '387c634d1ea74cf089e315ea7939987a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.707 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.707 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dedb70d4-aa87-42fa-a58e-7d05b0fe00ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.707839', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '71924002-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': 'a9d7775753575e32c507af45623d787a97ec31843ce829601fcfcce0ccc2c332'}]}, 'timestamp': '2025-11-26 09:44:03.708344', '_unique_id': '9d607445ab0c4fc7add39f30ef7f4d84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.709 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.710 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.710 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 62590000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c2697af-6ad9-40e8-95bf-f6452383ed72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 62590000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:44:03.710438', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7192a376-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.850669568, 'message_signature': '627bbdd396c63e2dc51e3bc4f0eb6a8a9f5254a58d6e26f1f9a32b07f0d8fe27'}]}, 'timestamp': '2025-11-26 09:44:03.710871', '_unique_id': '9f7aebdb2d844c1882eff1a79371286d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.711 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.713 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 9035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79cdb9e8-3916-4fef-b01b-513b0447fe65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9035, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.713178', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '71930ece-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': 'b31ff652d75b22be78ecb8973f6dbebb72fee2362654cf4e156f8e57139ab182'}]}, 'timestamp': '2025-11-26 09:44:03.713636', '_unique_id': 'acc809ce444b46b38735f5e57e142668'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:44:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost 
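The paired tracebacks above ("The above exception was the direct cause of the following exception") come from kombu's `raise ConnectionError(str(exc)) from exc`: a low-level `ConnectionRefusedError` (errno 111, `ECONNREFUSED`) from the socket connect is re-raised as a library-level error with the original preserved as `__cause__`. A minimal sketch of that pattern, assuming a hypothetical `OperationalError` stand-in and probing a localhost port that is almost certainly not listening (this is illustrative, not kombu's actual code):

```python
import errno
import socket


class OperationalError(ConnectionError):
    """Hypothetical stand-in for kombu.exceptions.OperationalError."""


def connect_or_reraise(host: str, port: int) -> socket.socket:
    """Attempt a plain TCP connect; re-raise any OSError as OperationalError,
    keeping the original exception as __cause__ -- which is exactly what
    produces the two-part 'direct cause' traceback seen in the log."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.connect((host, port))
        return sock
    except OSError as exc:
        sock.close()
        raise OperationalError(str(exc)) from exc
```

Connecting to a closed port on 127.0.0.1 fails immediately with `ECONNREFUSED` (111 on Linux), which is the same errno the agent reports when nothing is listening on the RabbitMQ port.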
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.715 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.716 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe6b45fc-7265-4e05-b3f2-b1897aca790f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.716031', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '71937e54-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': 'da2b6db9e1a968cd80bf1905cbff51e3aa235c7e7682009a02ce236cb836cdcb'}]}, 'timestamp': '2025-11-26 09:44:03.716490', '_unique_id': '258093715bc84744b2684d55328634bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.717 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.718 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.718 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '30bc4302-14bb-4af5-8d96-7c693c22f75e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:44:03.718577', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '7193e18c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.855067774, 'message_signature': 'fd184ec83d9a03215d1fc80a295607d3c9fd4dfe280f247466136aa8c54c980f'}]}, 'timestamp': '2025-11-26 09:44:03.719059', '_unique_id': 'cc58d702d0ad4a3f89709fe59297cc6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.719 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.721 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1141678425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.721 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 173265014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '308141ea-2453-495e-a704-dbd56246b3fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1141678425, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:44:03.721304', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71944c30-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': 'c656fe08592b54a6e1178ea24e3614b3ea510d1ee353ec09af8af14ea85c9cee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 173265014, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:44:03.721304', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71945d42-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10718.861009928, 'message_signature': '09c9ddd1c2f55e9928d93ede0ecded58c04f0277c8ce7092b2e9b1d7aef52784'}]}, 'timestamp': '2025-11-26 09:44:03.722168', '_unique_id': 'c49fc04f257a43faa9f254935918c74e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:44:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:44:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:44:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 04:44:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:44:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:44:05 localhost podman[281779]: 2025-11-26 09:44:05.83763392 +0000 UTC m=+0.088603711 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Nov 26 04:44:05 localhost podman[281779]: 2025-11-26 09:44:05.875346106 +0000 UTC m=+0.126315867 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 04:44:05 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:44:05 localhost podman[281778]: 2025-11-26 09:44:05.884726706 +0000 UTC m=+0.137924176 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:44:05 localhost podman[281778]: 2025-11-26 09:44:05.968498906 +0000 UTC m=+0.221696436 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:44:05 localhost nova_compute[281415]: 2025-11-26 09:44:05.967 281419 DEBUG nova.compute.manager [None req-60af4fb7-d09c-438e-b468-3abf8301ed8f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:05 localhost nova_compute[281415]: 2025-11-26 09:44:05.974 281419 INFO nova.compute.manager [None req-60af4fb7-d09c-438e-b468-3abf8301ed8f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Retrieving diagnostics#033[00m Nov 26 04:44:05 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:44:06 localhost nova_compute[281415]: 2025-11-26 09:44:06.405 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:08 localhost nova_compute[281415]: 2025-11-26 09:44:08.254 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36920 DF PROTO=TCP SPT=53058 DPT=9102 SEQ=4057334572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C99FFC0000000001030307) Nov 26 04:44:11 localhost nova_compute[281415]: 2025-11-26 09:44:11.434 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:12 localhost 
nova_compute[281415]: 2025-11-26 09:44:12.251 281419 DEBUG oslo_concurrency.lockutils [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:44:12 localhost nova_compute[281415]: 2025-11-26 09:44:12.252 281419 DEBUG oslo_concurrency.lockutils [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:44:12 localhost nova_compute[281415]: 2025-11-26 09:44:12.252 281419 DEBUG nova.compute.manager [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:12 localhost nova_compute[281415]: 2025-11-26 09:44:12.257 281419 DEBUG nova.compute.manager [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m Nov 26 04:44:12 localhost nova_compute[281415]: 2025-11-26 09:44:12.261 281419 DEBUG nova.objects.instance [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e 
b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lazy-loading 'flavor' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:12 localhost nova_compute[281415]: 2025-11-26 09:44:12.304 281419 DEBUG nova.virt.libvirt.driver [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Nov 26 04:44:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:44:12 localhost systemd[1]: tmp-crun.eTnpkh.mount: Deactivated successfully. Nov 26 04:44:12 localhost podman[281839]: 2025-11-26 09:44:12.704582826 +0000 UTC m=+0.087745963 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 04:44:12 localhost podman[281839]: 2025-11-26 09:44:12.789292065 +0000 UTC m=+0.172455202 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 26 04:44:12 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:44:13 localhost nova_compute[281415]: 2025-11-26 09:44:13.256 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:44:14 localhost systemd[1]: tmp-crun.4bjMOB.mount: Deactivated successfully. Nov 26 04:44:14 localhost podman[281932]: 2025-11-26 09:44:14.83419957 +0000 UTC m=+0.091691795 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, version=9.6, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41) Nov 26 04:44:14 localhost kernel: device tap5afdc9d0-95 left promiscuous mode Nov 26 04:44:14 localhost podman[281932]: 2025-11-26 09:44:14.881237165 +0000 UTC m=+0.138729430 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.) Nov 26 04:44:14 localhost NetworkManager[5970]: [1764150254.8820] device (tap5afdc9d0-95): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Nov 26 04:44:14 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:44:14 localhost nova_compute[281415]: 2025-11-26 09:44:14.899 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:14 localhost ovn_controller[153664]: 2025-11-26T09:44:14Z|00047|binding|INFO|Releasing lport 5afdc9d0-9595-4904-b83b-3d24f739ffec from this chassis (sb_readonly=0) Nov 26 04:44:14 localhost ovn_controller[153664]: 2025-11-26T09:44:14Z|00048|binding|INFO|Setting lport 5afdc9d0-9595-4904-b83b-3d24f739ffec down in Southbound Nov 26 04:44:14 localhost ovn_controller[153664]: 2025-11-26T09:44:14Z|00049|binding|INFO|Removing iface tap5afdc9d0-95 ovn-installed in OVS Nov 26 04:44:14 localhost nova_compute[281415]: 2025-11-26 09:44:14.903 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:14 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:14.911 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:0f:d8 192.168.0.160'], port_security=['fa:16:3e:8c:0f:d8 192.168.0.160'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.160/24', 'neutron:device_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005536118.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'neutron:revision_number': '7', 'neutron:security_group_ids': '10c2b79b-e9f0-444f-8b9c-e9015cac7c52 
4b147283-0178-4a15-bbd3-c1ef9b53dbb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9eb25cee-4262-4506-9877-de1032fbc4e7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5afdc9d0-9595-4904-b83b-3d24f739ffec) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:44:14 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:14.912 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 5afdc9d0-9595-4904-b83b-3d24f739ffec in datapath 3633976c-3aa0-4c4a-aa49-e8224cd25e39 unbound from our chassis#033[00m Nov 26 04:44:14 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:14.913 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3633976c-3aa0-4c4a-aa49-e8224cd25e39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:44:14 localhost nova_compute[281415]: 2025-11-26 09:44:14.915 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:14 localhost ovn_controller[153664]: 2025-11-26T09:44:14Z|00050|ovn_bfd|INFO|Disabled BFD on interface ovn-0e4a56-0 Nov 26 04:44:14 localhost ovn_controller[153664]: 2025-11-26T09:44:14Z|00051|ovn_bfd|INFO|Disabled BFD on interface ovn-9f6a17-0 Nov 26 04:44:14 localhost ovn_controller[153664]: 2025-11-26T09:44:14Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-7174ad-0 Nov 26 04:44:14 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:14.920 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a8f039-ba05-4dc7-8bff-c43f6408c99a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:14 
localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:14.921 159486 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39 namespace which is not needed anymore#033[00m Nov 26 04:44:14 localhost ovn_controller[153664]: 2025-11-26T09:44:14Z|00053|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:44:14 localhost nova_compute[281415]: 2025-11-26 09:44:14.921 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:14 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully. Nov 26 04:44:14 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 4min 2.995s CPU time. Nov 26 04:44:14 localhost nova_compute[281415]: 2025-11-26 09:44:14.928 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:14 localhost systemd-machined[83873]: Machine qemu-1-instance-00000002 terminated. 
Nov 26 04:44:14 localhost nova_compute[281415]: 2025-11-26 09:44:14.965 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:14 localhost ovn_controller[153664]: 2025-11-26T09:44:14Z|00054|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:44:14 localhost nova_compute[281415]: 2025-11-26 09:44:14.976 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.285 281419 DEBUG nova.compute.manager [req-a3a8b503-927a-4ab9-a574-6ddad2651ce0 req-e1618c7a-2f49-49a6-a5a6-eebdcf52b8ee ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Received event network-vif-unplugged-5afdc9d0-9595-4904-b83b-3d24f739ffec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.286 281419 DEBUG oslo_concurrency.lockutils [req-a3a8b503-927a-4ab9-a574-6ddad2651ce0 req-e1618c7a-2f49-49a6-a5a6-eebdcf52b8ee ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.287 281419 DEBUG oslo_concurrency.lockutils [req-a3a8b503-927a-4ab9-a574-6ddad2651ce0 req-e1618c7a-2f49-49a6-a5a6-eebdcf52b8ee ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.287 281419 DEBUG oslo_concurrency.lockutils [req-a3a8b503-927a-4ab9-a574-6ddad2651ce0 req-e1618c7a-2f49-49a6-a5a6-eebdcf52b8ee ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.287 281419 DEBUG nova.compute.manager [req-a3a8b503-927a-4ab9-a574-6ddad2651ce0 req-e1618c7a-2f49-49a6-a5a6-eebdcf52b8ee ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] No waiting events found dispatching network-vif-unplugged-5afdc9d0-9595-4904-b83b-3d24f739ffec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.288 281419 WARNING nova.compute.manager [req-a3a8b503-927a-4ab9-a574-6ddad2651ce0 req-e1618c7a-2f49-49a6-a5a6-eebdcf52b8ee ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Received unexpected event network-vif-unplugged-5afdc9d0-9595-4904-b83b-3d24f739ffec for instance with vm_state active and task_state powering-off.#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.327 281419 INFO nova.virt.libvirt.driver [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Instance shutdown successfully after 
3 seconds.#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.335 281419 INFO nova.virt.libvirt.driver [-] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Instance destroyed successfully.#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.336 281419 DEBUG nova.objects.instance [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.371 281419 DEBUG nova.compute.manager [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:15 localhost nova_compute[281415]: 2025-11-26 09:44:15.452 281419 DEBUG oslo_concurrency.lockutils [None req-95b57a64-2dc8-489b-b4b2-e0927dca73e1 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" "released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 3.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:44:15 localhost openstack_network_exporter[242153]: ERROR 09:44:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:44:15 localhost openstack_network_exporter[242153]: ERROR 09:44:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:44:15 localhost openstack_network_exporter[242153]: ERROR 09:44:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for 
the ovs db server Nov 26 04:44:15 localhost openstack_network_exporter[242153]: ERROR 09:44:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:44:15 localhost openstack_network_exporter[242153]: Nov 26 04:44:15 localhost openstack_network_exporter[242153]: ERROR 09:44:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:44:15 localhost openstack_network_exporter[242153]: Nov 26 04:44:16 localhost nova_compute[281415]: 2025-11-26 09:44:16.027 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:16 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:16.028 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:44:16 localhost nova_compute[281415]: 2025-11-26 09:44:16.436 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:44:16 localhost systemd[1]: tmp-crun.xWF29X.mount: Deactivated successfully. 
Nov 26 04:44:16 localhost podman[282006]: 2025-11-26 09:44:16.834026015 +0000 UTC m=+0.092496101 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:44:16 localhost podman[282006]: 2025-11-26 09:44:16.850377565 +0000 UTC m=+0.108847641 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:44:16 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.321 281419 DEBUG nova.compute.manager [req-bc59eb23-cf22-488e-8a90-fb35ec7764fa req-36fd0d58-d985-4404-bbba-1367d5f20daf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Received event network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.321 281419 DEBUG oslo_concurrency.lockutils [req-bc59eb23-cf22-488e-8a90-fb35ec7764fa req-36fd0d58-d985-4404-bbba-1367d5f20daf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.322 281419 DEBUG oslo_concurrency.lockutils [req-bc59eb23-cf22-488e-8a90-fb35ec7764fa req-36fd0d58-d985-4404-bbba-1367d5f20daf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.322 281419 DEBUG oslo_concurrency.lockutils [req-bc59eb23-cf22-488e-8a90-fb35ec7764fa req-36fd0d58-d985-4404-bbba-1367d5f20daf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.322 281419 DEBUG nova.compute.manager [req-bc59eb23-cf22-488e-8a90-fb35ec7764fa req-36fd0d58-d985-4404-bbba-1367d5f20daf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] No waiting events found dispatching network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.322 281419 WARNING nova.compute.manager [req-bc59eb23-cf22-488e-8a90-fb35ec7764fa req-36fd0d58-d985-4404-bbba-1367d5f20daf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Received unexpected event network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec for instance with vm_state stopped and task_state None.#033[00m Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.890 281419 DEBUG nova.compute.manager [None req-58a74114-0f87-4df2-b56f-67eb339ac369 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server [None req-58a74114-0f87-4df2-b56f-67eb339ac369 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR 
oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server raise self.value Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server raise self.value Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Nov 26 
04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af in power state shutdown. Cannot get_diagnostics while the instance is in this state. Nov 26 04:44:17 localhost nova_compute[281415]: 2025-11-26 09:44:17.913 281419 ERROR oslo_messaging.rpc.server #033[00m Nov 26 04:44:18 localhost nova_compute[281415]: 2025-11-26 09:44:18.298 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:20 localhost sshd[282030]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:44:21 localhost nova_compute[281415]: 2025-11-26 09:44:21.440 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:23 localhost nova_compute[281415]: 2025-11-26 09:44:23.323 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:44:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:44:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17602 DF PROTO=TCP SPT=42498 DPT=9102 SEQ=367521857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9D8790000000001030307) Nov 26 04:44:23 localhost podman[282033]: 2025-11-26 09:44:23.829669337 +0000 UTC m=+0.083682931 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, tcib_managed=true) Nov 26 04:44:23 localhost podman[282033]: 2025-11-26 09:44:23.845345637 +0000 UTC m=+0.099359181 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team) Nov 26 
04:44:23 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:44:23 localhost podman[282032]: 2025-11-26 09:44:23.934091443 +0000 UTC m=+0.192886213 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 04:44:23 localhost podman[282032]: 2025-11-26 09:44:23.969380012 +0000 UTC m=+0.228174782 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:44:23 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:44:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17603 DF PROTO=TCP SPT=42498 DPT=9102 SEQ=367521857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9DC7D0000000001030307) Nov 26 04:44:25 localhost systemd[1]: libpod-80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e.scope: Deactivated successfully. Nov 26 04:44:25 localhost podman[281983]: 2025-11-26 09:44:25.107530503 +0000 UTC m=+10.085563608 container died 80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 26 04:44:25 localhost podman[281983]: 2025-11-26 09:44:25.295614068 +0000 UTC m=+10.273647173 container cleanup 80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, distribution-scope=public, io.openshift.expose-services=, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Nov 26 04:44:25 localhost podman[282072]: 2025-11-26 09:44:25.308642216 +0000 UTC m=+0.190437508 container cleanup 80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 26 04:44:25 localhost systemd[1]: libpod-conmon-80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e.scope: Deactivated successfully. 
Nov 26 04:44:25 localhost podman[282090]: 2025-11-26 09:44:25.400559308 +0000 UTC m=+0.082102263 container remove 80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public) Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.407 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0a1433-bc7a-43b0-a36b-e33386df9c9b]: (4, ('Wed Nov 26 09:44:15 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39 
(80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e)\n80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e\nWed Nov 26 09:44:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39 (80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e)\n80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e\n', 'time="2025-11-26T09:44:25Z" level=warning msg="StopSignal SIGTERM failed to stop container neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39 in 10 seconds, resorting to SIGKILL"\n', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.411 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc950e2-5075-48db-85cd-bf0a31c545e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.412 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3633976c-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:25 localhost nova_compute[281415]: 2025-11-26 09:44:25.415 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:25 localhost kernel: device tap3633976c-30 left promiscuous mode Nov 26 04:44:25 localhost nova_compute[281415]: 2025-11-26 09:44:25.428 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:25 localhost nova_compute[281415]: 2025-11-26 09:44:25.430 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:25 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:44:25.433 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[cc302ca5-cdc0-4daf-9c89-0482e4630c93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.448 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f23ce09f-b272-48d0-b275-e37e4d8437f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.449 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[bf666745-c68c-450d-b056-0769abbba213]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.468 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[aeada6ff-f77f-4777-87c3-621efe5f8151]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 
'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623492, 'reachable_time': 35867, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 
'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282109, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.479 159623 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39 deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.480 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[23cad1fb-8b59-4ef7-a107-ae1d668f622b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.481 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 04:44:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:25.482 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36921 DF PROTO=TCP SPT=53058 DPT=9102 SEQ=4057334572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9DFFC0000000001030307) Nov 26 04:44:26 localhost systemd[1]: var-lib-containers-storage-overlay-f7559b400252157619e8013c4f621a43e0a48c792e3e4ada3b21ecf95d0bea65-merged.mount: Deactivated successfully. Nov 26 04:44:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80fe362c998c364f9ce1ffb3e71d38513195bcaa2e7c8e4ba20a3e7439113a4e-userdata-shm.mount: Deactivated successfully. Nov 26 04:44:26 localhost systemd[1]: run-netns-ovnmeta\x2d3633976c\x2d3aa0\x2d4c4a\x2daa49\x2de8224cd25e39.mount: Deactivated successfully. 
Nov 26 04:44:26 localhost nova_compute[281415]: 2025-11-26 09:44:26.478 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17604 DF PROTO=TCP SPT=42498 DPT=9102 SEQ=367521857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9E47C0000000001030307) Nov 26 04:44:27 localhost podman[240049]: time="2025-11-26T09:44:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:44:27 localhost podman[240049]: @ - - [26/Nov/2025:09:44:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146339 "" "Go-http-client/1.1" Nov 26 04:44:27 localhost podman[240049]: @ - - [26/Nov/2025:09:44:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16748 "" "Go-http-client/1.1" Nov 26 04:44:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60816 DF PROTO=TCP SPT=59486 DPT=9102 SEQ=338990211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9E7FC0000000001030307) Nov 26 04:44:28 localhost nova_compute[281415]: 2025-11-26 09:44:28.360 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:30 localhost nova_compute[281415]: 2025-11-26 09:44:30.126 281419 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:44:30 localhost nova_compute[281415]: 2025-11-26 09:44:30.127 281419 INFO nova.compute.manager [-] [instance: 
9d78bef9-6977-4fb5-b50b-ae75124e73af] VM Stopped (Lifecycle Event)#033[00m Nov 26 04:44:30 localhost nova_compute[281415]: 2025-11-26 09:44:30.159 281419 DEBUG nova.compute.manager [None req-34728364-c483-4889-97e3-5893b836789b - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:30 localhost nova_compute[281415]: 2025-11-26 09:44:30.165 281419 DEBUG nova.compute.manager [None req-34728364-c483-4889-97e3-5893b836789b - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 26 04:44:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17605 DF PROTO=TCP SPT=42498 DPT=9102 SEQ=367521857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9F43C0000000001030307) Nov 26 04:44:31 localhost nova_compute[281415]: 2025-11-26 09:44:31.516 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:33 localhost nova_compute[281415]: 2025-11-26 09:44:33.400 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.329 281419 DEBUG nova.compute.manager [None req-c8a3bfdd-c6ba-4436-976f-a0f5bd57b86f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server [None req-c8a3bfdd-c6ba-4436-976f-a0f5bd57b86f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af in power state shutdown. Cannot get_diagnostics while the instance is in this state. Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Nov 26 04:44:36 localhost nova_compute[281415]: 
2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server raise self.value Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in 
force_reraise Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server raise self.value Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af in power state shutdown. Cannot get_diagnostics while the instance is in this state. Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.357 281419 ERROR oslo_messaging.rpc.server #033[00m Nov 26 04:44:36 localhost nova_compute[281415]: 2025-11-26 09:44:36.555 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:44:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:44:36 localhost podman[282112]: 2025-11-26 09:44:36.829405835 +0000 UTC m=+0.086488667 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:44:36 localhost podman[282112]: 2025-11-26 09:44:36.842120704 +0000 UTC m=+0.099203536 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:44:36 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:44:36 localhost podman[282113]: 2025-11-26 09:44:36.94168147 +0000 UTC m=+0.197638138 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute) Nov 26 04:44:36 localhost podman[282113]: 2025-11-26 09:44:36.982478838 +0000 UTC m=+0.238435506 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true) Nov 26 04:44:36 
localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:44:37 localhost nova_compute[281415]: 2025-11-26 09:44:37.067 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:37 localhost nova_compute[281415]: 2025-11-26 09:44:37.095 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Triggering sync for uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 26 04:44:37 localhost nova_compute[281415]: 2025-11-26 09:44:37.095 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:44:37 localhost nova_compute[281415]: 2025-11-26 09:44:37.096 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:44:37 localhost nova_compute[281415]: 2025-11-26 09:44:37.096 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:37 localhost nova_compute[281415]: 2025-11-26 
09:44:37.159 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:44:38 localhost nova_compute[281415]: 2025-11-26 09:44:38.432 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17606 DF PROTO=TCP SPT=42498 DPT=9102 SEQ=367521857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA13FD0000000001030307) Nov 26 04:44:41 localhost nova_compute[281415]: 2025-11-26 09:44:41.590 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:41 localhost nova_compute[281415]: 2025-11-26 09:44:41.608 281419 DEBUG nova.objects.instance [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lazy-loading 'flavor' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:41 localhost nova_compute[281415]: 2025-11-26 09:44:41.629 281419 DEBUG oslo_concurrency.lockutils [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:44:41 localhost nova_compute[281415]: 2025-11-26 09:44:41.630 281419 
DEBUG oslo_concurrency.lockutils [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:44:41 localhost nova_compute[281415]: 2025-11-26 09:44:41.630 281419 DEBUG nova.network.neutron [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 26 04:44:41 localhost nova_compute[281415]: 2025-11-26 09:44:41.631 281419 DEBUG nova.objects.instance [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.356 281419 DEBUG nova.network.neutron [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.434 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.458 281419 DEBUG oslo_concurrency.lockutils [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.493 281419 INFO nova.virt.libvirt.driver [-] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Instance destroyed successfully.#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.493 281419 DEBUG nova.objects.instance [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.543 281419 DEBUG nova.objects.instance [None 
req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lazy-loading 'resources' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.600 281419 DEBUG nova.virt.libvirt.vif [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-26T08:29:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005536118.localdomain',hostname='test',id=2,image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-26T08:29:20Z,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005536118.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='b2fe3cd6f6ea49b8a2de01b236dd92e3',ramdisk_id='',reservation_id='r-hokjvvqr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_poi
nter_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-11-26T09:44:15Z,user_data=None,user_id='9f8fafc3f43241c3a71039595891ea0e',uuid=9d78bef9-6977-4fb5-b50b-ae75124e73af,vcpu_model=,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.600 281419 DEBUG nova.network.os_vif_util [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Converting VIF {"id": 
"5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.601 281419 DEBUG nova.network.os_vif_util [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.602 281419 DEBUG os_vif [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e 
b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.605 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.606 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5afdc9d0-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.608 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.611 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.615 281419 INFO os_vif [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95')#033[00m Nov 26 04:44:43 localhost 
nova_compute[281415]: 2025-11-26 09:44:43.617 281419 DEBUG nova.virt.libvirt.host [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.618 281419 INFO nova.virt.libvirt.host [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] UEFI support detected#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.626 281419 DEBUG nova.virt.libvirt.driver [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Start _get_guest_xml network_info=[{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=7ebee4f6-b3ad-441d-abd0-239ae838ae37,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'image_id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}], 'ephemerals': [{'guest_format': None, 'size': 1, 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vdb', 'device_type': 'disk', 'disk_bus': 'virtio'}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.631 281419 WARNING nova.virt.libvirt.driver [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.634 281419 DEBUG nova.virt.libvirt.host [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Searching host: 'np0005536118.localdomain' for CPU controller through CGroups V1... 
_has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.635 281419 DEBUG nova.virt.libvirt.host [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.637 281419 DEBUG nova.virt.libvirt.host [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Searching host: 'np0005536118.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.638 281419 DEBUG nova.virt.libvirt.host [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.639 281419 DEBUG nova.virt.libvirt.driver [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.639 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-26T08:28:14Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='a8cafabf-98f1-4bbc-a3ca-a9382f40900b',id=2,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=7ebee4f6-b3ad-441d-abd0-239ae838ae37,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.640 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.640 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 
9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.641 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.641 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.641 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.642 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.642 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e 
b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.643 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.643 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.644 281419 DEBUG nova.virt.hardware [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.644 281419 DEBUG nova.objects.instance [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.669 281419 DEBUG nova.privsep.utils [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default 
default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Nov 26 04:44:43 localhost nova_compute[281415]: 2025-11-26 09:44:43.670 281419 DEBUG oslo_concurrency.processutils [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:44:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:44:43 localhost podman[282155]: 2025-11-26 09:44:43.830818024 +0000 UTC m=+0.087308823 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3) Nov 26 04:44:43 localhost podman[282155]: 2025-11-26 09:44:43.895539074 +0000 UTC m=+0.152029823 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 04:44:43 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.141 281419 DEBUG oslo_concurrency.processutils [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.143 281419 DEBUG oslo_concurrency.processutils [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.563 281419 DEBUG oslo_concurrency.processutils [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.566 281419 DEBUG nova.virt.libvirt.vif [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-26T08:29:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005536118.localdomain',hostname='test',id=2,image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-26T08:29:20Z,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005536118.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='b2fe3cd6f6ea49b8a2de01b236dd92e3',ramdisk_id='',reservation_id='r-hokjvvqr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-11-26T09:44:15Z,user_data=None,user_id='9f8fafc3f43241c3a71039595891ea0e',uuid=9d78bef9-6977-4fb5-b50b-ae75124e73af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=Non
e,vm_state='stopped') vif={"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.567 281419 DEBUG nova.network.os_vif_util [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Converting VIF {"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.568 281419 DEBUG nova.network.os_vif_util [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.571 281419 DEBUG nova.objects.instance [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.596 281419 DEBUG nova.virt.libvirt.driver [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] End _get_guest_xml xml= Nov 26 04:44:44 localhost 
nova_compute[281415]: [guest domain XML elided: element markup was stripped during log capture; recoverable field values include uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af, name instance-00000002, memory 524288, 1 vCPU, creation time 2025-11-26 09:44:43, sysinfo owner admin/admin, manufacturer RDO, product OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9, description Virtual Machine, os type hvm, rng backend /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.597 281419 DEBUG nova.virt.libvirt.driver [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.598 281419 DEBUG nova.virt.libvirt.driver [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.598 281419 DEBUG nova.virt.libvirt.vif [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-26T08:29:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005536118.localdomain',hostname='test',id=2,image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-26T08:29:20Z,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005536118.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=,power_state=4,progress=0,project_id='b2fe3cd6f6ea49b8a2de01b236dd92e3',ramdisk_id='',reservation_id='r-hokjvvqr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='7ebee4f6-b3ad-441d-abd0-239ae838ae37',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-11-26T09:44:15Z,user_data=None,user_id='9f8fafc3f43241c3a71039595891ea0e',uuid=9d78bef9-6977-4fb5-b50b-ae75124e73af,vcpu_model=VirtCPUModel,vcpus=
1,vm_mode=None,vm_state='stopped') vif={"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.599 281419 DEBUG nova.network.os_vif_util [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Converting VIF {"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.599 281419 DEBUG nova.network.os_vif_util [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.599 281419 DEBUG os_vif [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.600 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.600 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.601 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.603 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.604 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5afdc9d0-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.604 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5afdc9d0-95, col_values=(('external_ids', {'iface-id': '5afdc9d0-9595-4904-b83b-3d24f739ffec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:0f:d8', 'vm-uuid': '9d78bef9-6977-4fb5-b50b-ae75124e73af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.638 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 
09:44:44.641 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.644 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.645 281419 INFO os_vif [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:0f:d8,bridge_name='br-int',has_traffic_filtering=True,id=5afdc9d0-9595-4904-b83b-3d24f739ffec,network=Network(3633976c-3aa0-4c4a-aa49-e8224cd25e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afdc9d0-95')#033[00m Nov 26 04:44:44 localhost systemd[1]: Started libvirt secret daemon. Nov 26 04:44:44 localhost kernel: device tap5afdc9d0-95 entered promiscuous mode Nov 26 04:44:44 localhost NetworkManager[5970]: [1764150284.7667] manager: (tap5afdc9d0-95): new Tun device (/org/freedesktop/NetworkManager/Devices/15) Nov 26 04:44:44 localhost systemd-udevd[282250]: Network interface NamePolicy= disabled on kernel command line. Nov 26 04:44:44 localhost ovn_controller[153664]: 2025-11-26T09:44:44Z|00055|binding|INFO|Claiming lport 5afdc9d0-9595-4904-b83b-3d24f739ffec for this chassis. 
Nov 26 04:44:44 localhost ovn_controller[153664]: 2025-11-26T09:44:44Z|00056|binding|INFO|5afdc9d0-9595-4904-b83b-3d24f739ffec: Claiming fa:16:3e:8c:0f:d8 192.168.0.160 Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.769 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.776 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.779 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost NetworkManager[5970]: [1764150284.7847] device (tap5afdc9d0-95): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 26 04:44:44 localhost NetworkManager[5970]: [1764150284.7864] device (tap5afdc9d0-95): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.799 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:0f:d8 192.168.0.160'], port_security=['fa:16:3e:8c:0f:d8 192.168.0.160'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.160/24', 'neutron:device_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'neutron:port_capabilities': 
'', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '10c2b79b-e9f0-444f-8b9c-e9015cac7c52 4b147283-0178-4a15-bbd3-c1ef9b53dbb6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9eb25cee-4262-4506-9877-de1032fbc4e7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5afdc9d0-9595-4904-b83b-3d24f739ffec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.801 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 5afdc9d0-9595-4904-b83b-3d24f739ffec in datapath 3633976c-3aa0-4c4a-aa49-e8224cd25e39 bound to our chassis#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.803 159486 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3633976c-3aa0-4c4a-aa49-e8224cd25e39#033[00m Nov 26 04:44:44 localhost ovn_controller[153664]: 2025-11-26T09:44:44Z|00057|ovn_bfd|INFO|Enabled BFD on interface ovn-0e4a56-0 Nov 26 04:44:44 localhost ovn_controller[153664]: 2025-11-26T09:44:44Z|00058|ovn_bfd|INFO|Enabled BFD on interface ovn-9f6a17-0 Nov 26 04:44:44 localhost ovn_controller[153664]: 2025-11-26T09:44:44Z|00059|ovn_bfd|INFO|Enabled BFD on interface ovn-7174ad-0 Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.809 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.815 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[337a0324-44ca-4756-bd8d-77de06613707]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.818 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3633976c-31 in ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.821 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.821 159592 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3633976c-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.821 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[6c190031-8444-4a81-8f8c-ccde0e9de9ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.824 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[938b217d-ca1c-451a-b163-3307288b4edb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.827 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost ovn_controller[153664]: 2025-11-26T09:44:44Z|00060|binding|INFO|Setting lport 5afdc9d0-9595-4904-b83b-3d24f739ffec ovn-installed in OVS Nov 26 04:44:44 localhost ovn_controller[153664]: 2025-11-26T09:44:44Z|00061|binding|INFO|Setting lport 5afdc9d0-9595-4904-b83b-3d24f739ffec up in Southbound Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.840 
281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.842 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[a743bfda-2ebe-44ba-8a0a-d934849b6ce4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.858 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost systemd-machined[83873]: New machine qemu-2-instance-00000002. Nov 26 04:44:44 localhost nova_compute[281415]: 2025-11-26 09:44:44.866 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.868 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[90cf7872-33a7-4bac-85d0-8c7801a59a64]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost systemd[1]: Started Virtual Machine qemu-2-instance-00000002. 
Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.898 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c864d2-f3ed-41e1-9fde-3124775f82db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.902 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[5f202b2a-cb5d-4ab5-9950-8de4359f7a62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost NetworkManager[5970]: [1764150284.9047] manager: (tap3633976c-30): new Veth device (/org/freedesktop/NetworkManager/Devices/16) Nov 26 04:44:44 localhost systemd-udevd[282253]: Network interface NamePolicy= disabled on kernel command line. Nov 26 04:44:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.940 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[65d06e34-01fa-46bc-a6e8-a3f930e6ddd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.945 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3cccd7-2368-43e3-85b6-c6b2bcdd6201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap3633976c-31: link becomes ready Nov 26 04:44:44 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap3633976c-30: link becomes ready Nov 26 04:44:44 localhost NetworkManager[5970]: [1764150284.9699] device (tap3633976c-30): carrier: link connected Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.977 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[7fbffe87-805f-4ff3-9c5e-f2c60168b37e]: (4, None) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:44 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:44.995 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f665e7ae-3e00-4f71-88e0-20ce3e71143f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3633976c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:45:53:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 
'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1076014, 'reachable_time': 36284, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 
'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282300, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.009 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4b8179ac-33f3-4aee-9aa3-59fd988661c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:5357'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1076014, 'tstamp': 1076014}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282303, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.022 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[451249a3-c82e-4dfc-bcc6-7616afe24f6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3633976c-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], 
['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:45:53:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 
'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1076014, 'reachable_time': 36284, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 
1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 282304, 'error': None, 'target': 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:45 localhost podman[282278]: 2025-11-26 09:44:45.035594114 +0000 UTC m=+0.095168183 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.051 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f50217dd-af02-4cd2-8454-e44f7292e59e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:45 localhost podman[282278]: 2025-11-26 09:44:45.078788396 +0000 UTC m=+0.138362455 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 26 04:44:45 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.115 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[92a8187a-b93a-4691-a4dd-1a02a23101a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.117 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3633976c-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.117 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.118 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3633976c-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:45 localhost kernel: device tap3633976c-30 entered promiscuous mode Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.120 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.125 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3633976c-30, col_values=(('external_ids', {'iface-id': '7d243368-b21b-43d3-98dc-158093f352bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.126 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:45 localhost ovn_controller[153664]: 2025-11-26T09:44:45Z|00062|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.134 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.135 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.136 159486 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3633976c-3aa0-4c4a-aa49-e8224cd25e39.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3633976c-3aa0-4c4a-aa49-e8224cd25e39.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.137 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a25ee0f3-67ba-48f2-bf9a-afdaeb4cf2b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.139 159486 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: global Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: log /dev/log local0 debug Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: log-tag haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39 Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: user root Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: group root Nov 26 04:44:45 localhost 
ovn_metadata_agent[159481]: maxconn 1024 Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: pidfile /var/lib/neutron/external/pids/3633976c-3aa0-4c4a-aa49-e8224cd25e39.pid.haproxy Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: daemon Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: defaults Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: log global Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: mode http Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: option httplog Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: option dontlognull Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: option http-server-close Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: option forwardfor Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: retries 3 Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: timeout http-request 30s Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: timeout connect 30s Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: timeout client 32s Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: timeout server 32s Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: timeout http-keep-alive 30s Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: listen listener Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: bind 169.254.169.254:80 Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: server metadata /var/lib/neutron/metadata_proxy Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: http-request add-header X-OVN-Network-ID 3633976c-3aa0-4c4a-aa49-e8224cd25e39 Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 26 04:44:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:44:45.140 159486 DEBUG neutron.agent.linux.utils [-] 
Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'env', 'PROCESS_TAG=haproxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3633976c-3aa0-4c4a-aa49-e8224cd25e39.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.164 281419 DEBUG nova.compute.manager [req-f0b88403-0e34-4237-9edb-0f3cfdc2ad23 req-d36aab95-eda2-4617-9a3a-91d9468190b9 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Received event network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.164 281419 DEBUG oslo_concurrency.lockutils [req-f0b88403-0e34-4237-9edb-0f3cfdc2ad23 req-d36aab95-eda2-4617-9a3a-91d9468190b9 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.164 281419 DEBUG oslo_concurrency.lockutils [req-f0b88403-0e34-4237-9edb-0f3cfdc2ad23 req-d36aab95-eda2-4617-9a3a-91d9468190b9 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.165 281419 DEBUG 
oslo_concurrency.lockutils [req-f0b88403-0e34-4237-9edb-0f3cfdc2ad23 req-d36aab95-eda2-4617-9a3a-91d9468190b9 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.165 281419 DEBUG nova.compute.manager [req-f0b88403-0e34-4237-9edb-0f3cfdc2ad23 req-d36aab95-eda2-4617-9a3a-91d9468190b9 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] No waiting events found dispatching network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.165 281419 WARNING nova.compute.manager [req-f0b88403-0e34-4237-9edb-0f3cfdc2ad23 req-d36aab95-eda2-4617-9a3a-91d9468190b9 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Received unexpected event network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec for instance with vm_state stopped and task_state powering-on.#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.323 281419 DEBUG nova.compute.manager [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.326 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - 
- -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.326 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] VM Resumed (Lifecycle Event)#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.332 281419 INFO nova.virt.libvirt.driver [-] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Instance rebooted successfully.#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.333 281419 DEBUG nova.compute.manager [None req-59f0dc08-bc65-48ae-8441-bf64679b490f 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.370 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.375 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.408 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] During sync_power_state the instance 
has a pending task (powering-on). Skip.#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.409 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.409 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] VM Started (Lifecycle Event)#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.447 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.452 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 26 04:44:45 localhost podman[282386]: Nov 26 04:44:45 localhost podman[282386]: 2025-11-26 09:44:45.587753057 +0000 UTC m=+0.102779445 container create b85a89debc53c717715b8fb44f9dccc24b644a20657e6cb4e43fd9e13e676cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 04:44:45 localhost podman[282386]: 2025-11-26 09:44:45.541781511 +0000 UTC m=+0.056807919 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 26 04:44:45 localhost systemd[1]: Started libpod-conmon-b85a89debc53c717715b8fb44f9dccc24b644a20657e6cb4e43fd9e13e676cf7.scope. Nov 26 04:44:45 localhost systemd[1]: Started libcrun container. Nov 26 04:44:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96fdae78b702dacb1dfcb1b10db586dd25126d89939a99d3cba0a814f6491eaf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:44:45 localhost podman[282386]: 2025-11-26 09:44:45.669773997 +0000 UTC m=+0.184800375 container init b85a89debc53c717715b8fb44f9dccc24b644a20657e6cb4e43fd9e13e676cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 04:44:45 localhost podman[282386]: 2025-11-26 09:44:45.67775416 +0000 UTC m=+0.192780538 container start b85a89debc53c717715b8fb44f9dccc24b644a20657e6cb4e43fd9e13e676cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 04:44:45 localhost neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282400]: [NOTICE] (282404) : New worker (282406) forked Nov 26 04:44:45 localhost neutron-haproxy-ovnmeta-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282400]: [NOTICE] (282404) : Loading success. Nov 26 04:44:45 localhost openstack_network_exporter[242153]: ERROR 09:44:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:44:45 localhost openstack_network_exporter[242153]: ERROR 09:44:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:44:45 localhost openstack_network_exporter[242153]: ERROR 09:44:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:44:45 localhost openstack_network_exporter[242153]: ERROR 09:44:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:44:45 localhost openstack_network_exporter[242153]: Nov 26 04:44:45 localhost openstack_network_exporter[242153]: ERROR 09:44:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:44:45 localhost openstack_network_exporter[242153]: Nov 26 04:44:45 localhost ovn_controller[153664]: 2025-11-26T09:44:45Z|00063|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:44:45 localhost nova_compute[281415]: 2025-11-26 09:44:45.804 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:46 localhost ovn_controller[153664]: 2025-11-26T09:44:46Z|00064|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:44:46 localhost nova_compute[281415]: 2025-11-26 09:44:46.523 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:46 localhost ovn_controller[153664]: 2025-11-26T09:44:46Z|00065|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:44:46 localhost nova_compute[281415]: 2025-11-26 09:44:46.535 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:46 localhost nova_compute[281415]: 2025-11-26 09:44:46.590 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:46 localhost snmpd[66980]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB. Nov 26 04:44:46 localhost nova_compute[281415]: 2025-11-26 09:44:46.952 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:46 localhost nova_compute[281415]: 2025-11-26 09:44:46.954 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:46 localhost nova_compute[281415]: 2025-11-26 09:44:46.954 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:44:46 localhost nova_compute[281415]: 2025-11-26 09:44:46.955 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of 
instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.083 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.083 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.084 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.085 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.234 281419 DEBUG nova.compute.manager [req-47d67dc4-8ff4-4217-9e51-21873eb12ce6 req-965d71df-9584-4503-8849-e9e1cdde5063 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Received event network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 
09:44:47.236 281419 DEBUG oslo_concurrency.lockutils [req-47d67dc4-8ff4-4217-9e51-21873eb12ce6 req-965d71df-9584-4503-8849-e9e1cdde5063 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.236 281419 DEBUG oslo_concurrency.lockutils [req-47d67dc4-8ff4-4217-9e51-21873eb12ce6 req-965d71df-9584-4503-8849-e9e1cdde5063 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.237 281419 DEBUG oslo_concurrency.lockutils [req-47d67dc4-8ff4-4217-9e51-21873eb12ce6 req-965d71df-9584-4503-8849-e9e1cdde5063 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.238 281419 DEBUG nova.compute.manager [req-47d67dc4-8ff4-4217-9e51-21873eb12ce6 req-965d71df-9584-4503-8849-e9e1cdde5063 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] No waiting events found dispatching network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 26 04:44:47 localhost 
nova_compute[281415]: 2025-11-26 09:44:47.239 281419 WARNING nova.compute.manager [req-47d67dc4-8ff4-4217-9e51-21873eb12ce6 req-965d71df-9584-4503-8849-e9e1cdde5063 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Received unexpected event network-vif-plugged-5afdc9d0-9595-4904-b83b-3d24f739ffec for instance with vm_state active and task_state None.#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.586 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.615 281419 DEBUG oslo_concurrency.lockutils [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.616 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.617 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.618 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.619 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.620 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.621 281419 DEBUG oslo_service.periodic_task [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.622 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.623 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.623 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.645 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.646 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.646 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.647 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:44:47 localhost nova_compute[281415]: 2025-11-26 09:44:47.648 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:44:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:44:47 localhost podman[282416]: 2025-11-26 09:44:47.846417192 +0000 UTC m=+0.099711843 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:44:47 localhost podman[282416]: 2025-11-26 09:44:47.866661031 +0000 UTC m=+0.119955682 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:44:47 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.192 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.302 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.305 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.522 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.523 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12436MB free_disk=41.8370475769043GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.524 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.524 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.615 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.615 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.616 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:44:48 localhost nova_compute[281415]: 2025-11-26 09:44:48.654 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:44:49 localhost nova_compute[281415]: 2025-11-26 09:44:49.115 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:44:49 localhost nova_compute[281415]: 2025-11-26 09:44:49.123 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:44:49 localhost nova_compute[281415]: 
2025-11-26 09:44:49.154 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:44:49 localhost nova_compute[281415]: 2025-11-26 09:44:49.157 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:44:49 localhost nova_compute[281415]: 2025-11-26 09:44:49.158 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:44:49 localhost nova_compute[281415]: 2025-11-26 09:44:49.679 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:51 localhost nova_compute[281415]: 2025-11-26 09:44:51.594 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39795 DF PROTO=TCP SPT=60250 DPT=9102 SEQ=1690994296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA4DA90000000001030307) Nov 26 04:44:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:44:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:44:54 localhost podman[282482]: 2025-11-26 09:44:54.248229215 +0000 UTC m=+0.092396548 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent) Nov 26 04:44:54 localhost podman[282482]: 2025-11-26 09:44:54.254370972 +0000 UTC m=+0.098538295 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0) Nov 26 04:44:54 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:44:54 localhost podman[282483]: 2025-11-26 09:44:54.310404707 +0000 UTC m=+0.148983889 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:44:54 localhost podman[282483]: 2025-11-26 09:44:54.32325565 +0000 UTC m=+0.161834812 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 04:44:54 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:44:54 localhost nova_compute[281415]: 2025-11-26 09:44:54.713 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39796 DF PROTO=TCP SPT=60250 DPT=9102 SEQ=1690994296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA51BD0000000001030307) Nov 26 04:44:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17607 DF PROTO=TCP SPT=42498 DPT=9102 SEQ=367521857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA53FC0000000001030307) Nov 26 04:44:56 localhost nova_compute[281415]: 2025-11-26 09:44:56.637 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:44:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39797 DF PROTO=TCP SPT=60250 DPT=9102 SEQ=1690994296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA59BD0000000001030307) Nov 26 04:44:57 localhost podman[240049]: time="2025-11-26T09:44:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:44:57 localhost podman[240049]: @ - - [26/Nov/2025:09:44:57 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147525 "" "Go-http-client/1.1" Nov 26 04:44:57 localhost podman[240049]: @ - - [26/Nov/2025:09:44:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17239 "" "Go-http-client/1.1" Nov 26 04:44:57 localhost sshd[282518]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:44:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36922 DF PROTO=TCP SPT=53058 DPT=9102 SEQ=4057334572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA5DFC0000000001030307) Nov 26 04:44:59 localhost ovn_controller[153664]: 2025-11-26T09:44:59Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:0f:d8 192.168.0.160 Nov 26 04:44:59 localhost nova_compute[281415]: 2025-11-26 09:44:59.753 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39798 DF PROTO=TCP SPT=60250 DPT=9102 SEQ=1690994296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA697C0000000001030307) Nov 26 04:45:01 localhost nova_compute[281415]: 2025-11-26 09:45:01.678 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:03 localhost nova_compute[281415]: 2025-11-26 09:45:03.482 281419 DEBUG nova.compute.manager [None req-fb07a290-93df-4d1f-bc3b-ff55b57b064b 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Checking state _get_power_state 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:45:03 localhost nova_compute[281415]: 2025-11-26 09:45:03.488 281419 INFO nova.compute.manager [None req-fb07a290-93df-4d1f-bc3b-ff55b57b064b 9f8fafc3f43241c3a71039595891ea0e b2fe3cd6f6ea49b8a2de01b236dd92e3 - - default default] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Retrieving diagnostics#033[00m Nov 26 04:45:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:03.648 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:45:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:03.649 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:45:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:03.650 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:04.227 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:04.229 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: Content-Type: 
text/plain#015 Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:04 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:04 localhost nova_compute[281415]: 2025-11-26 09:45:04.756 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.006 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.008 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 04:45:06 localhost nova_compute[281415]: 2025-11-26 09:45:06.044 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.065 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.066 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 1.8375912#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37784 [26/Nov/2025:09:45:04.226] listener listener/metadata 0/0/0/1840/1840 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.085 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.086 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37788 [26/Nov/2025:09:45:06.085] listener listener/metadata 0/0/0/25/25 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.110 159587 INFO 
eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404 len: 297 time: 0.0233381#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.125 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.126 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.140 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37790 [26/Nov/2025:09:45:06.124] listener listener/metadata 0/0/0/16/16 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.141 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.0150323#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.147 
159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.148 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.162 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37792 [26/Nov/2025:09:45:06.147] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.163 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200 len: 136 time: 0.0143328#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.169 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 
09:45:06.170 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.183 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37802 [26/Nov/2025:09:45:06.169] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.184 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200 len: 143 time: 0.0144806#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.191 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.192 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 
04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.204 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37818 [26/Nov/2025:09:45:06.190] listener listener/metadata 0/0/0/14/14 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.204 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200 len: 149 time: 0.0127079#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.211 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.212 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 
169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.223 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37826 [26/Nov/2025:09:45:06.210] listener listener/metadata 0/0/0/13/13 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.224 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200 len: 150 time: 0.0119083#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.230 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.231 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost 
ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.241 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.242 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200 len: 139 time: 0.0106044#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37840 [26/Nov/2025:09:45:06.230] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.248 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.249 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:45:06.260 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.260 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200 len: 139 time: 0.0114219#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37848 [26/Nov/2025:09:45:06.248] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.267 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.268 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37862 [26/Nov/2025:09:45:06.267] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data 
HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.282 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/user-data HTTP/1.1" status: 404 len: 297 time: 0.0144210#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.296 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.297 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.310 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37876 [26/Nov/2025:09:45:06.296] listener listener/metadata 0/0/0/14/14 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.310 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 
200 len: 155 time: 0.0131390#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.315 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.316 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.330 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37884 [26/Nov/2025:09:45:06.315] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.331 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200 len: 138 time: 0.0146658#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.336 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' 
server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.336 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.348 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37898 [26/Nov/2025:09:45:06.335] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.348 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200 len: 143 time: 0.0118423#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.353 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.354 
159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.367 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37906 [26/Nov/2025:09:45:06.352] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.368 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200 len: 143 time: 0.0141013#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.374 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.375 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 
Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.386 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37908 [26/Nov/2025:09:45:06.373] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.387 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200 len: 139 time: 0.0122156#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.393 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.394 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 
04:45:06 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 192.168.0.160#015 Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 3633976c-3aa0-4c4a-aa49-e8224cd25e39 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.407 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:45:06 localhost haproxy-metadata-proxy-3633976c-3aa0-4c4a-aa49-e8224cd25e39[282406]: 192.168.0.160:37918 [26/Nov/2025:09:45:06.393] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Nov 26 04:45:06 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:06.408 159587 INFO eventlet.wsgi.server [-] 192.168.0.160, "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200 len: 139 time: 0.0140781#033[00m Nov 26 04:45:06 localhost nova_compute[281415]: 2025-11-26 09:45:06.680 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:45:07 localhost podman[282520]: 2025-11-26 09:45:07.840969834 +0000 UTC m=+0.095123601 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:45:07 localhost systemd[1]: tmp-crun.SAy8IE.mount: Deactivated successfully. 
Nov 26 04:45:07 localhost podman[282521]: 2025-11-26 09:45:07.899111213 +0000 UTC m=+0.149869727 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 04:45:07 localhost podman[282520]: 2025-11-26 09:45:07.907223571 +0000 UTC m=+0.161377358 container exec_died 
b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:45:07 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:45:07 localhost podman[282521]: 2025-11-26 09:45:07.962059889 +0000 UTC m=+0.212818443 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:45:07 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:45:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39799 DF PROTO=TCP SPT=60250 DPT=9102 SEQ=1690994296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA89FC0000000001030307) Nov 26 04:45:09 localhost nova_compute[281415]: 2025-11-26 09:45:09.794 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:11 localhost ovn_metadata_agent[159481]: 2025-11-26 09:45:11.010 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:45:11 localhost nova_compute[281415]: 2025-11-26 09:45:11.713 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:45:14 localhost systemd[1]: tmp-crun.cUUr2M.mount: Deactivated successfully. 
Nov 26 04:45:14 localhost podman[282582]: 2025-11-26 09:45:14.28171624 +0000 UTC m=+0.091596074 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:45:14 localhost podman[282582]: 2025-11-26 09:45:14.40723617 +0000 UTC m=+0.217115954 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 04:45:14 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:45:14 localhost nova_compute[281415]: 2025-11-26 09:45:14.797 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:14 localhost ovn_controller[153664]: 2025-11-26T09:45:14Z|00066|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory Nov 26 04:45:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:45:15 localhost openstack_network_exporter[242153]: ERROR 09:45:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:45:15 localhost openstack_network_exporter[242153]: ERROR 09:45:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:45:15 localhost openstack_network_exporter[242153]: ERROR 09:45:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:45:15 localhost openstack_network_exporter[242153]: ERROR 09:45:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:45:15 localhost openstack_network_exporter[242153]: Nov 26 04:45:15 localhost openstack_network_exporter[242153]: ERROR 09:45:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:45:15 localhost openstack_network_exporter[242153]: Nov 26 04:45:15 localhost podman[282654]: 2025-11-26 09:45:15.837312183 +0000 UTC m=+0.093355537 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm) Nov 26 04:45:15 localhost podman[282654]: 2025-11-26 09:45:15.855408867 +0000 UTC m=+0.111452211 container 
exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 26 04:45:15 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:45:16 localhost nova_compute[281415]: 2025-11-26 09:45:16.716 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:45:18 localhost podman[282673]: 2025-11-26 09:45:18.800061859 +0000 UTC m=+0.067958510 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:45:18 localhost podman[282673]: 2025-11-26 09:45:18.811422736 +0000 UTC m=+0.079319377 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:45:18 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:45:19 localhost nova_compute[281415]: 2025-11-26 09:45:19.830 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:21 localhost nova_compute[281415]: 2025-11-26 09:45:21.719 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11659 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2229333146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAC2D90000000001030307) Nov 26 04:45:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:45:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:45:24 localhost podman[282713]: 2025-11-26 09:45:24.835548014 +0000 UTC m=+0.090324674 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 04:45:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 
MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11660 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2229333146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAC6FC0000000001030307) Nov 26 04:45:24 localhost nova_compute[281415]: 2025-11-26 09:45:24.865 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:24 localhost podman[282713]: 2025-11-26 09:45:24.866578304 +0000 UTC m=+0.121354904 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 04:45:24 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:45:24 localhost systemd[1]: tmp-crun.of42qB.mount: Deactivated successfully. Nov 26 04:45:24 localhost podman[282714]: 2025-11-26 09:45:24.914249892 +0000 UTC m=+0.163356508 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd) Nov 26 04:45:24 localhost podman[282714]: 2025-11-26 09:45:24.930826229 +0000 UTC m=+0.179932845 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:45:24 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:45:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39800 DF PROTO=TCP SPT=60250 DPT=9102 SEQ=1690994296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAC9FD0000000001030307) Nov 26 04:45:26 localhost nova_compute[281415]: 2025-11-26 09:45:26.748 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11661 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2229333146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CACEFC0000000001030307) Nov 26 04:45:27 localhost podman[240049]: time="2025-11-26T09:45:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:45:27 localhost podman[240049]: @ - - [26/Nov/2025:09:45:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147525 "" "Go-http-client/1.1" Nov 26 04:45:27 localhost podman[240049]: @ - - [26/Nov/2025:09:45:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17231 "" "Go-http-client/1.1" Nov 26 04:45:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 
MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17608 DF PROTO=TCP SPT=42498 DPT=9102 SEQ=367521857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAD1FD0000000001030307) Nov 26 04:45:29 localhost nova_compute[281415]: 2025-11-26 09:45:29.905 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11662 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2229333146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CADEBC0000000001030307) Nov 26 04:45:31 localhost nova_compute[281415]: 2025-11-26 09:45:31.751 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:45:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:45:34 localhost nova_compute[281415]: 2025-11-26 09:45:34.907 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:35 localhost sshd[282750]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:45:36 localhost nova_compute[281415]: 2025-11-26 09:45:36.788 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:36 localhost sshd[282752]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:45:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 04:45:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:45:38 localhost systemd[1]: tmp-crun.fUqxuG.mount: Deactivated successfully. Nov 26 04:45:38 localhost podman[282754]: 2025-11-26 09:45:38.846828147 +0000 UTC m=+0.101014452 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:45:38 localhost podman[282754]: 2025-11-26 09:45:38.861498245 +0000 UTC m=+0.115684500 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:45:38 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:45:38 localhost podman[282755]: 2025-11-26 09:45:38.942220025 +0000 UTC m=+0.193470800 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 04:45:38 localhost podman[282755]: 2025-11-26 09:45:38.95743551 +0000 UTC m=+0.208686255 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 26 04:45:38 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:45:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11663 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2229333146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAFFFC0000000001030307) Nov 26 04:45:39 localhost nova_compute[281415]: 2025-11-26 09:45:39.953 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:41 localhost nova_compute[281415]: 2025-11-26 09:45:41.811 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:45:44 localhost podman[282796]: 2025-11-26 09:45:44.834153928 +0000 UTC m=+0.086323982 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 04:45:44 localhost podman[282796]: 2025-11-26 09:45:44.935358324 +0000 UTC m=+0.187528398 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 04:45:44 localhost nova_compute[281415]: 2025-11-26 09:45:44.956 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:44 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:45:45 localhost openstack_network_exporter[242153]: ERROR 09:45:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:45:45 localhost openstack_network_exporter[242153]: ERROR 09:45:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:45:45 localhost openstack_network_exporter[242153]: ERROR 09:45:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:45:45 localhost openstack_network_exporter[242153]: ERROR 09:45:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:45:45 localhost openstack_network_exporter[242153]: Nov 26 04:45:45 localhost openstack_network_exporter[242153]: ERROR 09:45:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:45:45 localhost openstack_network_exporter[242153]: Nov 26 04:45:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:45:46 localhost nova_compute[281415]: 2025-11-26 09:45:46.816 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:46 localhost podman[282820]: 2025-11-26 09:45:46.837625364 +0000 UTC m=+0.094968527 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, architecture=x86_64, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1755695350) Nov 26 04:45:46 localhost podman[282820]: 2025-11-26 09:45:46.853448858 +0000 UTC m=+0.110792041 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 26 04:45:46 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:45:49 localhost nova_compute[281415]: 2025-11-26 09:45:49.048 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:49 localhost nova_compute[281415]: 2025-11-26 09:45:49.048 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:49 localhost nova_compute[281415]: 2025-11-26 09:45:49.076 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:49 localhost nova_compute[281415]: 2025-11-26 09:45:49.076 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:45:49 localhost nova_compute[281415]: 2025-11-26 09:45:49.077 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:45:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:45:49 localhost podman[282839]: 2025-11-26 09:45:49.826343823 +0000 UTC m=+0.090125328 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:45:49 localhost podman[282839]: 2025-11-26 09:45:49.839707402 +0000 UTC m=+0.103488907 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:45:49 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:45:49 localhost nova_compute[281415]: 2025-11-26 09:45:49.987 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:50 localhost nova_compute[281415]: 2025-11-26 09:45:50.102 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:45:50 localhost nova_compute[281415]: 2025-11-26 09:45:50.102 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:45:50 localhost nova_compute[281415]: 2025-11-26 09:45:50.103 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:45:50 localhost nova_compute[281415]: 2025-11-26 09:45:50.103 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:45:51 localhost nova_compute[281415]: 2025-11-26 09:45:51.816 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.272 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache 
with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.940 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.940 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.941 281419 DEBUG oslo_service.periodic_task [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.942 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.942 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.942 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.943 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.943 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.944 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - 
- - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.944 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.974 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.974 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.974 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.975 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 
04:45:52 localhost nova_compute[281415]: 2025-11-26 09:45:52.975 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.436 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.512 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.512 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.741 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.743 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12262MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.743 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.744 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:45:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52373 DF PROTO=TCP SPT=57868 DPT=9102 SEQ=187707623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB38090000000001030307) Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.859 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.860 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.860 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:45:53 localhost nova_compute[281415]: 2025-11-26 09:45:53.912 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:45:54 localhost nova_compute[281415]: 2025-11-26 09:45:54.683 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.771s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:45:54 localhost nova_compute[281415]: 2025-11-26 09:45:54.692 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:45:54 localhost nova_compute[281415]: 
2025-11-26 09:45:54.719 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:45:54 localhost nova_compute[281415]: 2025-11-26 09:45:54.722 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:45:54 localhost nova_compute[281415]: 2025-11-26 09:45:54.723 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:45:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52374 DF PROTO=TCP SPT=57868 DPT=9102 SEQ=187707623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB3BFC0000000001030307) Nov 26 04:45:54 localhost nova_compute[281415]: 2025-11-26 09:45:54.990 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:55 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:45:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:45:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11664 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2229333146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB3FFC0000000001030307) Nov 26 04:45:55 localhost systemd[1]: tmp-crun.LuObpX.mount: Deactivated successfully. Nov 26 04:45:55 localhost podman[282907]: 2025-11-26 09:45:55.891835936 +0000 UTC m=+0.146225575 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:45:55 localhost podman[282906]: 2025-11-26 09:45:55.846751647 +0000 UTC m=+0.105263561 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 04:45:55 localhost podman[282906]: 2025-11-26 09:45:55.927853858 +0000 UTC m=+0.186365812 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:45:55 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:45:55 localhost podman[282907]: 2025-11-26 09:45:55.983739118 +0000 UTC m=+0.238128767 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:45:55 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:45:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52375 DF PROTO=TCP SPT=57868 DPT=9102 SEQ=187707623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB43FC0000000001030307) Nov 26 04:45:56 localhost nova_compute[281415]: 2025-11-26 09:45:56.846 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:45:57 localhost podman[240049]: time="2025-11-26T09:45:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:45:57 localhost podman[240049]: @ - - [26/Nov/2025:09:45:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147525 "" "Go-http-client/1.1" Nov 26 04:45:57 localhost podman[240049]: @ - - [26/Nov/2025:09:45:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1" Nov 26 04:45:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39801 DF PROTO=TCP SPT=60250 DPT=9102 SEQ=1690994296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB47FD0000000001030307) Nov 26 04:46:00 localhost nova_compute[281415]: 2025-11-26 09:46:00.030 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52376 DF PROTO=TCP SPT=57868 DPT=9102 SEQ=187707623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB53BC0000000001030307) Nov 26 04:46:01 localhost nova_compute[281415]: 2025-11-26 09:46:01.848 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.580 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.581 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context 
of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.585 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6da60880-e47a-4a92-8f36-2a747b227189', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.581629', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b9062aac-caac-11f0-9fe3-fa163e73ba36', 
'monotonic_time': 10838.823884175, 'message_signature': '20c567873f008e39fb9f8a85af4c8fc846b23c5bb3f6b3707e9fb07793cd32a1'}]}, 'timestamp': '2025-11-26 09:46:03.586374', '_unique_id': '5bedda7a92a64a1dbba410438b280d37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.588 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.589 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.617 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.618 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a33356de-53ce-4219-8ffa-de800bdb8e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.589773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b90b156c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '0d0dc6a776e8a674d6fa07bfa049ca55dfd6c74a95f69ad92e8a080e85fc2c82'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.589773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b90b266a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '5fa27f861e941b12406a69fe5f918c47538de978b416d43bfac114d7c57982a4'}]}, 'timestamp': '2025-11-26 09:46:03.618878', '_unique_id': '7608e18102f744a2936366596e927983'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.621 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.621 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b118d4d5-4943-45f7-91b9-20ea599770f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.621145', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b90b9000-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 'message_signature': 'f8c137c8c11335cb4acbd8d6a14a8c11c6daee490bd94edb7ddc667099be5c05'}]}, 'timestamp': '2025-11-26 09:46:03.621612', '_unique_id': '0b689d3eea6e4c4c9bfe8ef5b1baeb84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.623 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.623 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.624 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10b50b2c-c850-4682-8241-758ea8687cb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.623759', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b90bf720-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '003b19b4a3e30601e2e8b46d5e2e52d5197e01e1a02548a6c335e67c6ebc3290'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.623759', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b90c079c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '2c18fe5524ab2b344cfcc2fb4e335b7ec9a0d5edf9a3c446f4ed4c9907e04c7f'}]}, 'timestamp': '2025-11-26 09:46:03.624640', '_unique_id': 'ea3f8b68715d470fb351e8c4c7a76e79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.625 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.626 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.626 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd18dc2e1-4548-4682-b76e-7643ecd17774', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.626808', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b90c6e44-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 'message_signature': '18e6addeb66be297f8e48895a70d22baba233ee2246c874d57c465c5f94e0d77'}]}, 'timestamp': '2025-11-26 09:46:03.627297', '_unique_id': '562668348ee2476780037e62d9a97959'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.628 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.629 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.648 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 11740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '164188bc-7cdf-4138-bf76-831da6349f0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11740000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:46:03.629414', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b90fb554-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.890442562, 'message_signature': '09286bcefdcff4b41b179b31ac32b9661759555fe1cdff3dded1b07014ad4b49'}]}, 'timestamp': '2025-11-26 09:46:03.648769', '_unique_id': 'd729856ce62e4f5eb955698c2bbd2437'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:46:03.649 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:46:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:46:03.650 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:46:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:46:03.651 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.662 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0de5aec3-3c25-4f6c-818e-fa35032c0a86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.650918', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b911e536-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.893192766, 'message_signature': '8205db1c40876901fd89ea6b854111e6c21689d279395155525fe4e5e45e0fe7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.650918', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b911f76a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.893192766, 'message_signature': '4a4f623c16c09da44b9e175c50d91962d38fb999da05567bbb147e11bc4a1647'}]}, 'timestamp': '2025-11-26 09:46:03.663548', '_unique_id': '012c8bb0f6964e94a41c1a88712ad765'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:46:03.664 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:46:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.665 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.665 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.665 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.666 12 DEBUG 
ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b12b350-8dd8-4eb5-83eb-fa53395b15e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.665844', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9126344-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': 'e2b1c8bc83aceaace1e726325459f0f97b28e0d4ccf281576b0e296700f03646'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.665844', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9127474-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '80c93e07d637049f3764559ed2b8078324855d89791c35073ab86e1711227856'}]}, 'timestamp': '2025-11-26 09:46:03.666746', '_unique_id': 'e408114ceeff4faaae2a5653f0b4f11e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:46:03.667 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.668 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.668 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.669 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37deb66d-aa1d-42ee-95bf-3a720ae65008', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.669057', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b912dfc2-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 
'message_signature': '430896f79d732e876aa73550ae4c43a22f8d254dd56e1327afad0c3ad57ff721'}]}, 'timestamp': '2025-11-26 09:46:03.669524', '_unique_id': 'e6cec74983c54040b1a1b7770573a101'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.671 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '81bd0ab4-03f0-494b-b7fb-20fc950c7f6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.671624', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b913430e-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '6ac9f881e1ec7b570bc079012ce80b8ac2fa07f2a46fd88801d71adb6384daac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.671624', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9135452-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '6a43062c238afc8d7332204c98e543c460fbd4ec2190641adf4c18b9749af930'}]}, 'timestamp': '2025-11-26 09:46:03.672479', '_unique_id': '0d17311763584504b21da0bc359f24e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.674 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9d17a88c-24a1-4fe1-9751-72d78ee16bf9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.674617', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b913b870-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 'message_signature': '955acad64c5b37d7441c52fd14f61734c230ff9e5a06468f5f44e77be948679d'}]}, 'timestamp': '2025-11-26 09:46:03.675102', '_unique_id': 'e8fb47c504d54c1f8b840e43daa9e47f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.675 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.677 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.677 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cf30a59-5b9d-46ee-ad9b-abd134387dcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.677178', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9141c2a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '96695d21dbed7572bcd5226aaed8c65b071ba25813ebea25fe2ac6fd3f4f8600'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.677178', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9142c2e-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '531f6cb7346cfb472a3be2f06d0e8a528f8a2c21cf906ff3243a71bcad8cdf86'}]}, 'timestamp': '2025-11-26 09:46:03.678037', '_unique_id': 'fac8760efa2a4d99870a204c3eaad18f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.678 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.680 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.680 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99ed4c78-dd76-4498-a257-b01720dfdce0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.680156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9149088-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.893192766, 'message_signature': 'daf3cfdfd0cb50a6db58909e65dbae9d03560b7304b4b9f4446c911de1c4cce8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.680156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b914a06e-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.893192766, 'message_signature': '0f46bc953d128609034fa359fcaae2fd4ba0a0e892bde41d35f5b9e6a87964ce'}]}, 'timestamp': '2025-11-26 09:46:03.681008', '_unique_id': 'd626f70594b947dabab567a1c5b2925c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.683 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.683 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.683 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684
12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1516c8fe-a6f5-40a8-9e8b-c1e73cc5c0bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.683143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b915054a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.893192766, 'message_signature': '346f4b32014e52cc86f773e37a8c2b34ed786c11ea3218684923318740414cf9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.683143', 'resource_metadata': {'display_name': 'test', 
'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9151526-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.893192766, 'message_signature': '802e092c2a4978d577ab52756cecfe06610e715ed7a9b4343f6b85b22acc0dce'}]}, 'timestamp': '2025-11-26 09:46:03.683994', '_unique_id': 'fb538de53c37467a9341f93a25761798'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 
129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.684 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.685 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.686 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.686 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.686 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd0d9dee-5dc9-459e-9974-9c68f16e2784', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:46:03.686262', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9157f66-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': '51e891522122fdc41fe85369f75a0f6a320acf4d44e7887501cc8de6fb512ee9'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:46:03.686262', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9158f42-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.832023745, 'message_signature': 'b667530cf78e24f8eaf6f7508c02382764cd9f4ff3a0799736560c8284233ec3'}]}, 'timestamp': '2025-11-26 09:46:03.687124', '_unique_id': 'b84dc0155d6f454696259a9985b6b6bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.689 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.689 12 DEBUG ceilometer.compute.pollsters [-] 
9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bf795c4-60b4-4a87-9ca1-5c58daa79a94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:46:03.689220', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b915f2b6-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.890442562, 'message_signature': '56334cf1bd0b656b0374f7a21990c1836f0c9ad66d16bc7961461f6f7e0d438d'}]}, 'timestamp': '2025-11-26 09:46:03.689652', '_unique_id': 'c9f65c0893244406b57037e38b4c0451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 
04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.691 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.691 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 7111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2297a09b-40ef-463b-8551-9c820667195c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 7111, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.691736', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b916560c-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 'message_signature': '3cb780defbf26e2efaceb458222d9289421e776812989f02b8ce6102df7703f9'}]}, 'timestamp': '2025-11-26 09:46:03.692214', '_unique_id': 'ce015caa737a4405befdc645d1d39df1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.694 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '51bcb214-7ca0-4e8a-8f68-699ddaba06b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.694308', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b916b962-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 'message_signature': '0f0e86ea5a5a2dcdef78cf18278df8a68b166b7b6a0e6330120954680dd3da88'}]}, 'timestamp': '2025-11-26 09:46:03.694754', '_unique_id': '42386e116a884580a887d46130420737'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.695 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.697 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c22f08fe-89aa-45a3-b69d-25b8e4cd947d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.697088', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b9172686-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 'message_signature': '1da9ad044cc9b8a8aad0663abf47f51aadebdd13d07afe8905f6228a7cc9fd77'}]}, 'timestamp': '2025-11-26 09:46:03.697546', '_unique_id': '2f1635d8168e47f48bcc2abb7b6e6e2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.698 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.699 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.699 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d65791f-ca32-4598-b6d4-0da3b323ddaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.699623', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b917890a-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 'message_signature': '2eef68a557c9535b13a4565e35ac4444ed98663bbb7de201e98289a5dcda06ff'}]}, 'timestamp': '2025-11-26 09:46:03.700100', '_unique_id': '9f7f9b66b398414b83b1978690826842'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.700 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.702 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.702 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ebb5ab9-d7d9-4f81-87de-ae4df2b7368e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7111, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:46:03.702148', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'b917ec56-caac-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10838.823884175, 'message_signature': 'fb8eb890918653975a5df5ea9246ac51485e5d8494d12bd2ff966952b60dbd54'}]}, 'timestamp': '2025-11-26 09:46:03.702527', '_unique_id': '63bc4d11adf3498e8d36f390a76165bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging   File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:46:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:46:03.703 12 ERROR oslo_messaging.notify.messaging Nov 26 04:46:05 localhost nova_compute[281415]: 2025-11-26 09:46:05.076 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:46:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 5728 writes, 25K keys, 5728 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5728 writes, 781 syncs, 7.33 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 36 writes, 94 keys, 36 commit groups, 1.0 writes per commit group, ingest: 0.06 MB, 0.00 MB/s#012Interval WAL: 36 writes, 18 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 04:46:06 localhost nova_compute[281415]: 2025-11-26 09:46:06.883 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52377 DF PROTO=TCP 
SPT=57868 DPT=9102 SEQ=187707623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB73FC0000000001030307) Nov 26 04:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:46:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 4966 writes, 22K keys, 4966 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4966 writes, 663 syncs, 7.49 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 106 writes, 334 keys, 106 commit groups, 1.0 writes per commit group, ingest: 0.53 MB, 0.00 MB/s#012Interval WAL: 106 writes, 42 syncs, 2.52 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 04:46:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:46:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:46:09 localhost podman[282945]: 2025-11-26 09:46:09.83878072 +0000 UTC m=+0.091926453 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 26 04:46:09 localhost podman[282945]: 2025-11-26 09:46:09.855532253 +0000 UTC m=+0.108677986 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:46:09 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:46:09 localhost podman[282944]: 2025-11-26 09:46:09.939402829 +0000 UTC m=+0.196543604 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:46:09 localhost podman[282944]: 2025-11-26 09:46:09.946279629 +0000 UTC m=+0.203420404 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:46:09 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:46:10 localhost nova_compute[281415]: 2025-11-26 09:46:10.120 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:11 localhost nova_compute[281415]: 2025-11-26 09:46:11.925 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:15 localhost nova_compute[281415]: 2025-11-26 09:46:15.161 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:46:15 localhost openstack_network_exporter[242153]: ERROR 09:46:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:46:15 localhost openstack_network_exporter[242153]: ERROR 09:46:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:46:15 localhost openstack_network_exporter[242153]: ERROR 09:46:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:46:15 localhost openstack_network_exporter[242153]: ERROR 09:46:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:46:15 localhost openstack_network_exporter[242153]: Nov 26 04:46:15 localhost openstack_network_exporter[242153]: ERROR 09:46:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:46:15 localhost openstack_network_exporter[242153]: Nov 26 04:46:15 localhost podman[282986]: 2025-11-26 09:46:15.841471683 +0000 UTC m=+0.099583688 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:46:15 localhost podman[282986]: 2025-11-26 09:46:15.886390868 +0000 UTC m=+0.144502873 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:46:15 localhost 
systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:46:16 localhost sshd[283011]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:46:16 localhost nova_compute[281415]: 2025-11-26 09:46:16.971 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:46:17 localhost podman[283013]: 2025-11-26 09:46:17.824252836 +0000 UTC m=+0.086240899 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350) Nov 26 04:46:17 localhost podman[283013]: 2025-11-26 09:46:17.839110521 +0000 UTC m=+0.101098644 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64) Nov 26 04:46:17 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:46:20 localhost nova_compute[281415]: 2025-11-26 09:46:20.192 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:20 localhost podman[283121]: 2025-11-26 09:46:20.25743304 +0000 UTC m=+0.123200830 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', 
'--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:46:20 localhost podman[283121]: 2025-11-26 09:46:20.266363463 +0000 UTC m=+0.132131273 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:46:20 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:46:22 localhost nova_compute[281415]: 2025-11-26 09:46:22.010 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16476 DF PROTO=TCP SPT=52444 DPT=9102 SEQ=469011641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBAD390000000001030307) Nov 26 04:46:24 localhost sshd[283198]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:46:24 localhost systemd-logind[761]: New session 62 of user zuul. Nov 26 04:46:24 localhost systemd[1]: Started Session 62 of User zuul. 
Nov 26 04:46:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16477 DF PROTO=TCP SPT=52444 DPT=9102 SEQ=469011641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBB13C0000000001030307) Nov 26 04:46:24 localhost python3[283220]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 04:46:25 localhost nova_compute[281415]: 2025-11-26 09:46:25.245 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:25 localhost systemd-journald[47778]: Field hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation. Nov 26 04:46:25 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 26 04:46:25 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:46:25 localhost rsyslogd[760]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 04:46:25 localhost subscription-manager[283221]: Unregistered machine with identity: c3ce536d-b475-4829-b391-421a6c252b3b Nov 26 04:46:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52378 DF PROTO=TCP SPT=57868 DPT=9102 SEQ=187707623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBB3FC0000000001030307) Nov 26 04:46:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:46:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:46:26 localhost podman[283224]: 2025-11-26 09:46:26.829071995 +0000 UTC m=+0.089892104 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118) Nov 26 04:46:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16478 DF PROTO=TCP SPT=52444 DPT=9102 SEQ=469011641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBB93C0000000001030307) Nov 26 04:46:26 localhost podman[283224]: 2025-11-26 09:46:26.865481128 +0000 UTC m=+0.126301217 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Nov 26 04:46:26 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:46:26 localhost systemd[1]: tmp-crun.LH9YSw.mount: Deactivated successfully. 
Nov 26 04:46:26 localhost podman[283225]: 2025-11-26 09:46:26.893990197 +0000 UTC m=+0.149658497 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:46:26 localhost podman[283225]: 2025-11-26 09:46:26.914274682 +0000 UTC m=+0.169943012 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 26 04:46:26 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:46:27 localhost nova_compute[281415]: 2025-11-26 09:46:27.051 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:27 localhost podman[240049]: time="2025-11-26T09:46:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:46:27 localhost podman[240049]: @ - - [26/Nov/2025:09:46:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147525 "" "Go-http-client/1.1" Nov 26 04:46:27 localhost podman[240049]: @ - - [26/Nov/2025:09:46:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17233 "" "Go-http-client/1.1" Nov 26 04:46:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11665 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2229333146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBBDFC0000000001030307) Nov 26 04:46:30 localhost nova_compute[281415]: 2025-11-26 09:46:30.280 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16479 DF PROTO=TCP SPT=52444 DPT=9102 SEQ=469011641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBC8FD0000000001030307) Nov 26 04:46:32 localhost nova_compute[281415]: 2025-11-26 09:46:32.087 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:35 localhost nova_compute[281415]: 2025-11-26 09:46:35.313 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:37 localhost nova_compute[281415]: 2025-11-26 09:46:37.110 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16480 DF PROTO=TCP SPT=52444 DPT=9102 SEQ=469011641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBE9FC0000000001030307) Nov 26 04:46:40 localhost nova_compute[281415]: 2025-11-26 09:46:40.349 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:46:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:46:40 localhost systemd[1]: tmp-crun.sLhiF1.mount: Deactivated successfully. 
Nov 26 04:46:40 localhost podman[283262]: 2025-11-26 09:46:40.848775613 +0000 UTC m=+0.105298319 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:46:40 localhost podman[283262]: 2025-11-26 09:46:40.862370382 +0000 UTC m=+0.118893178 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:46:40 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:46:40 localhost podman[283263]: 2025-11-26 09:46:40.940584975 +0000 UTC m=+0.194928593 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:46:40 localhost podman[283263]: 2025-11-26 09:46:40.980431014 +0000 UTC m=+0.234774612 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 
04:46:40 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:46:42 localhost nova_compute[281415]: 2025-11-26 09:46:42.134 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:44 localhost systemd[1]: virtsecretd.service: Deactivated successfully. Nov 26 04:46:45 localhost nova_compute[281415]: 2025-11-26 09:46:45.391 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:45 localhost openstack_network_exporter[242153]: ERROR 09:46:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:46:45 localhost openstack_network_exporter[242153]: ERROR 09:46:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:46:45 localhost openstack_network_exporter[242153]: ERROR 09:46:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:46:45 localhost openstack_network_exporter[242153]: ERROR 09:46:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:46:45 localhost openstack_network_exporter[242153]: Nov 26 04:46:45 localhost openstack_network_exporter[242153]: ERROR 09:46:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:46:45 localhost openstack_network_exporter[242153]: Nov 26 04:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:46:46 localhost podman[283304]: 2025-11-26 09:46:46.794759893 +0000 UTC m=+0.057199286 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3) Nov 26 04:46:46 localhost podman[283304]: 2025-11-26 09:46:46.896627374 +0000 UTC m=+0.159066787 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Nov 26 04:46:46 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:46:47 localhost nova_compute[281415]: 2025-11-26 09:46:47.136 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:46:48 localhost podman[283330]: 2025-11-26 09:46:48.822375473 +0000 UTC m=+0.079935437 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6, release=1755695350, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible) Nov 26 04:46:48 localhost podman[283330]: 2025-11-26 09:46:48.838300664 +0000 UTC m=+0.095860668 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible) Nov 26 04:46:48 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:46:50 localhost nova_compute[281415]: 2025-11-26 09:46:50.432 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:46:50 localhost podman[283351]: 2025-11-26 09:46:50.822356142 +0000 UTC m=+0.082311421 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:46:50 localhost podman[283351]: 2025-11-26 09:46:50.830165752 +0000 UTC m=+0.090121061 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:46:50 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:46:52 localhost nova_compute[281415]: 2025-11-26 09:46:52.139 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:53 localhost sshd[283374]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:46:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39820 DF PROTO=TCP SPT=38548 DPT=9102 SEQ=465806097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC22690000000001030307) Nov 26 04:46:54 localhost nova_compute[281415]: 2025-11-26 09:46:54.725 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:46:54 localhost nova_compute[281415]: 2025-11-26 09:46:54.728 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:46:54 localhost nova_compute[281415]: 2025-11-26 09:46:54.728 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:46:54 localhost nova_compute[281415]: 2025-11-26 09:46:54.729 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:46:54 localhost nova_compute[281415]: 2025-11-26 09:46:54.814 281419 DEBUG oslo_concurrency.lockutils 
[None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:46:54 localhost nova_compute[281415]: 2025-11-26 09:46:54.815 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:46:54 localhost nova_compute[281415]: 2025-11-26 09:46:54.815 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:46:54 localhost nova_compute[281415]: 2025-11-26 09:46:54.816 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:46:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39821 DF PROTO=TCP SPT=38548 DPT=9102 SEQ=465806097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC267C0000000001030307) Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.375 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", 
"subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.388 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.389 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.390 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 
26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.390 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.391 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.391 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.391 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.392 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.393 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.393 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.411 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.412 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.412 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.413 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 
09:46:55.413 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.470 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16481 DF PROTO=TCP SPT=52444 DPT=9102 SEQ=469011641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC29FC0000000001030307) Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.902 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.968 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:46:55 localhost nova_compute[281415]: 2025-11-26 09:46:55.969 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.207 281419 WARNING nova.virt.libvirt.driver [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.210 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12286MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.211 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.212 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.291 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.292 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.292 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.345 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.809 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.817 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 
2025-11-26 09:46:56.838 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.841 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:46:56 localhost nova_compute[281415]: 2025-11-26 09:46:56.842 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:46:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39822 DF PROTO=TCP SPT=38548 DPT=9102 SEQ=465806097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC2E7D0000000001030307) Nov 26 04:46:57 localhost nova_compute[281415]: 2025-11-26 09:46:57.187 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:46:57 localhost 
podman[240049]: time="2025-11-26T09:46:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:46:57 localhost podman[240049]: @ - - [26/Nov/2025:09:46:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147525 "" "Go-http-client/1.1" Nov 26 04:46:57 localhost podman[240049]: @ - - [26/Nov/2025:09:46:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17232 "" "Go-http-client/1.1" Nov 26 04:46:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:46:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:46:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52379 DF PROTO=TCP SPT=57868 DPT=9102 SEQ=187707623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC31FC0000000001030307) Nov 26 04:46:57 localhost podman[283474]: 2025-11-26 09:46:57.827092088 +0000 UTC m=+0.083982871 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 04:46:57 localhost podman[283474]: 2025-11-26 09:46:57.861054745 +0000 UTC m=+0.117945508 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 04:46:57 localhost systemd[1]: tmp-crun.jTkIkb.mount: Deactivated successfully. Nov 26 04:46:57 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:46:57 localhost podman[283475]: 2025-11-26 09:46:57.884958443 +0000 UTC m=+0.137775610 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd) Nov 26 04:46:57 localhost podman[283475]: 2025-11-26 09:46:57.897168659 +0000 UTC m=+0.149985846 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 04:46:57 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:47:00 localhost nova_compute[281415]: 2025-11-26 09:47:00.520 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:00 localhost sshd[283513]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:47:00 localhost systemd-logind[761]: New session 63 of user tripleo-admin. Nov 26 04:47:00 localhost systemd[1]: Created slice User Slice of UID 1003. Nov 26 04:47:00 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Nov 26 04:47:00 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Nov 26 04:47:00 localhost systemd[1]: Starting User Manager for UID 1003... Nov 26 04:47:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:2e:91:c0 MACDST=fa:16:3e:70:c4:f6 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39823 DF PROTO=TCP SPT=38548 DPT=9102 SEQ=465806097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC3E3C0000000001030307) Nov 26 04:47:00 localhost systemd[283517]: Queued start job for default target Main User Target. Nov 26 04:47:00 localhost systemd[283517]: Created slice User Application Slice. Nov 26 04:47:00 localhost systemd[283517]: Started Mark boot as successful after the user session has run 2 minutes. Nov 26 04:47:00 localhost systemd[283517]: Started Daily Cleanup of User's Temporary Directories. Nov 26 04:47:00 localhost systemd[283517]: Reached target Paths. Nov 26 04:47:00 localhost systemd[283517]: Reached target Timers. Nov 26 04:47:00 localhost systemd[283517]: Starting D-Bus User Message Bus Socket... Nov 26 04:47:01 localhost systemd[283517]: Starting Create User's Volatile Files and Directories... Nov 26 04:47:01 localhost systemd[283517]: Listening on D-Bus User Message Bus Socket. Nov 26 04:47:01 localhost systemd[283517]: Reached target Sockets. Nov 26 04:47:01 localhost systemd[283517]: Finished Create User's Volatile Files and Directories. 
Nov 26 04:47:01 localhost systemd[283517]: Reached target Basic System. Nov 26 04:47:01 localhost systemd[283517]: Reached target Main User Target. Nov 26 04:47:01 localhost systemd[283517]: Startup finished in 158ms. Nov 26 04:47:01 localhost systemd[1]: Started User Manager for UID 1003. Nov 26 04:47:01 localhost systemd[1]: Started Session 63 of User tripleo-admin. Nov 26 04:47:01 localhost python3[283660]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)#012add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"#012# 100 ceph_dashboard (8443)#012add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"#012# 100 ceph_grafana (3100)#012add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"#012# 100 ceph_prometheus (9092)#012add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"#012# 100 ceph_rgw (8080)#012add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"#012# 110 ceph_mon (6789, 3300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"#012# 112 ceph_mds (6800-7300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"#012# 113 ceph_mgr (6800-7300, 8444)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"#012# 120 ceph_nfs (2049, 12049)#012add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"#012# 123 ceph_dashboard (9090, 9094, 9283)#012add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state 
new counter accept comment "123 ceph_dashboard"#012 insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 26 04:47:02 localhost nova_compute[281415]: 2025-11-26 09:47:02.231 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:02 localhost python3[283804]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 26 04:47:02 localhost systemd[1]: Stopping Netfilter Tables... Nov 26 04:47:02 localhost systemd[1]: nftables.service: Deactivated successfully. Nov 26 04:47:02 localhost systemd[1]: Stopped Netfilter Tables. Nov 26 04:47:02 localhost systemd[1]: Starting Netfilter Tables... Nov 26 04:47:02 localhost systemd[1]: Finished Netfilter Tables. 
Nov 26 04:47:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:47:03.651 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:47:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:47:03.651 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:47:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:47:03.653 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:47:05 localhost nova_compute[281415]: 2025-11-26 09:47:05.565 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:07 localhost nova_compute[281415]: 2025-11-26 09:47:07.276 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:10 localhost podman[283942]: Nov 26 04:47:10 localhost podman[283942]: 2025-11-26 09:47:10.347370516 +0000 UTC m=+0.084022202 container create 9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_hertz, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux ) Nov 26 04:47:10 localhost systemd[1]: Started libpod-conmon-9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a.scope. Nov 26 04:47:10 localhost podman[283942]: 2025-11-26 09:47:10.311039726 +0000 UTC m=+0.047691462 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:47:10 localhost systemd[1]: Started libcrun container. 
Nov 26 04:47:10 localhost podman[283942]: 2025-11-26 09:47:10.440086596 +0000 UTC m=+0.176738292 container init 9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_hertz, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, vendor=Red Hat, Inc., release=553, maintainer=Guillaume Abrioux , architecture=x86_64, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:47:10 localhost podman[283942]: 2025-11-26 09:47:10.453976815 +0000 UTC m=+0.190628511 container start 9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_hertz, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , version=7, build-date=2025-09-24T08:57:55, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
release=553, ceph=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 04:47:10 localhost podman[283942]: 2025-11-26 09:47:10.454388408 +0000 UTC m=+0.191040124 container attach 9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_hertz, RELEASE=main, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:47:10 localhost frosty_hertz[283958]: 167 167 Nov 26 04:47:10 localhost systemd[1]: libpod-9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a.scope: 
Deactivated successfully. Nov 26 04:47:10 localhost podman[283942]: 2025-11-26 09:47:10.460444075 +0000 UTC m=+0.197095801 container died 9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_hertz, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=553, vendor=Red Hat, Inc., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:47:10 localhost podman[283963]: 2025-11-26 09:47:10.568412234 +0000 UTC m=+0.091738929 container remove 9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_hertz, name=rhceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, 
version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, release=553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12) Nov 26 04:47:10 localhost nova_compute[281415]: 2025-11-26 09:47:10.568 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:10 localhost systemd[1]: libpod-conmon-9d4ec7146d4a66f9e3e53cb64b1007e36263101c0dbe468a24b01a54e404569a.scope: Deactivated successfully. Nov 26 04:47:10 localhost podman[283985]: Nov 26 04:47:10 localhost podman[283985]: 2025-11-26 09:47:10.828626571 +0000 UTC m=+0.088013846 container create 4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_pike, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, 
release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:47:10 localhost systemd[1]: Started libpod-conmon-4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c.scope. Nov 26 04:47:10 localhost podman[283985]: 2025-11-26 09:47:10.792718653 +0000 UTC m=+0.052105958 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:47:10 localhost systemd[1]: Started libcrun container. Nov 26 04:47:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afbc4cc4a8df1bde2b8457f0e837ed4f1038d5c628f76234d2bf9834d7df92a2/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 26 04:47:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:47:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afbc4cc4a8df1bde2b8457f0e837ed4f1038d5c628f76234d2bf9834d7df92a2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 04:47:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afbc4cc4a8df1bde2b8457f0e837ed4f1038d5c628f76234d2bf9834d7df92a2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 04:47:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afbc4cc4a8df1bde2b8457f0e837ed4f1038d5c628f76234d2bf9834d7df92a2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 04:47:10 localhost podman[283985]: 2025-11-26 09:47:10.906300747 +0000 UTC m=+0.165688032 container init 4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_pike, GIT_BRANCH=main, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.buildah.version=1.33.12) Nov 26 04:47:10 localhost podman[283985]: 2025-11-26 09:47:10.919224045 +0000 UTC m=+0.178611330 container start 4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_pike, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.expose-services=, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12) Nov 26 04:47:10 localhost podman[283985]: 2025-11-26 09:47:10.919503974 +0000 UTC m=+0.178891259 container attach 4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_pike, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.buildah.version=1.33.12, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55, release=553, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 04:47:11 localhost podman[284004]: 2025-11-26 09:47:11.013478002 +0000 UTC m=+0.095603529 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:47:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:47:11 localhost podman[284004]: 2025-11-26 09:47:11.054411985 +0000 UTC m=+0.136537452 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:47:11 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:47:11 localhost podman[284030]: 2025-11-26 09:47:11.153162241 +0000 UTC m=+0.100995206 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=edpm) Nov 26 04:47:11 localhost podman[284030]: 2025-11-26 09:47:11.192412112 +0000 UTC m=+0.140245067 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 04:47:11 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:47:11 localhost systemd[1]: var-lib-containers-storage-overlay-74b143e69cc6dea9da425b9471e05de9fb475e46aa3a6b4684a578e0920ddb9b-merged.mount: Deactivated successfully. Nov 26 04:47:12 localhost unruffled_pike[284001]: [ Nov 26 04:47:12 localhost unruffled_pike[284001]: { Nov 26 04:47:12 localhost unruffled_pike[284001]: "available": false, Nov 26 04:47:12 localhost unruffled_pike[284001]: "ceph_device": false, Nov 26 04:47:12 localhost unruffled_pike[284001]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 26 04:47:12 localhost unruffled_pike[284001]: "lsm_data": {}, Nov 26 04:47:12 localhost unruffled_pike[284001]: "lvs": [], Nov 26 04:47:12 localhost unruffled_pike[284001]: "path": "/dev/sr0", Nov 26 04:47:12 localhost unruffled_pike[284001]: "rejected_reasons": [ Nov 26 04:47:12 localhost unruffled_pike[284001]: "Insufficient space (<5GB)", Nov 26 04:47:12 localhost unruffled_pike[284001]: "Has a FileSystem" Nov 26 04:47:12 localhost unruffled_pike[284001]: ], Nov 26 04:47:12 localhost unruffled_pike[284001]: "sys_api": { Nov 26 04:47:12 localhost unruffled_pike[284001]: "actuators": null, Nov 26 04:47:12 localhost unruffled_pike[284001]: "device_nodes": "sr0", Nov 26 04:47:12 localhost unruffled_pike[284001]: "human_readable_size": "482.00 KB", Nov 26 04:47:12 localhost unruffled_pike[284001]: "id_bus": "ata", Nov 26 04:47:12 localhost unruffled_pike[284001]: "model": "QEMU DVD-ROM", Nov 26 04:47:12 localhost unruffled_pike[284001]: "nr_requests": "2", Nov 26 04:47:12 localhost unruffled_pike[284001]: "partitions": {}, Nov 26 04:47:12 localhost unruffled_pike[284001]: "path": "/dev/sr0", Nov 26 04:47:12 localhost unruffled_pike[284001]: "removable": "1", Nov 26 04:47:12 localhost unruffled_pike[284001]: "rev": "2.5+", Nov 26 04:47:12 localhost unruffled_pike[284001]: "ro": "0", Nov 26 04:47:12 localhost unruffled_pike[284001]: "rotational": "1", Nov 26 04:47:12 localhost unruffled_pike[284001]: "sas_address": "", Nov 26 04:47:12 localhost 
unruffled_pike[284001]: "sas_device_handle": "", Nov 26 04:47:12 localhost unruffled_pike[284001]: "scheduler_mode": "mq-deadline", Nov 26 04:47:12 localhost unruffled_pike[284001]: "sectors": 0, Nov 26 04:47:12 localhost unruffled_pike[284001]: "sectorsize": "2048", Nov 26 04:47:12 localhost unruffled_pike[284001]: "size": 493568.0, Nov 26 04:47:12 localhost unruffled_pike[284001]: "support_discard": "0", Nov 26 04:47:12 localhost unruffled_pike[284001]: "type": "disk", Nov 26 04:47:12 localhost unruffled_pike[284001]: "vendor": "QEMU" Nov 26 04:47:12 localhost unruffled_pike[284001]: } Nov 26 04:47:12 localhost unruffled_pike[284001]: } Nov 26 04:47:12 localhost unruffled_pike[284001]: ] Nov 26 04:47:12 localhost systemd[1]: libpod-4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c.scope: Deactivated successfully. Nov 26 04:47:12 localhost systemd[1]: libpod-4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c.scope: Consumed 1.248s CPU time. Nov 26 04:47:12 localhost podman[283985]: 2025-11-26 09:47:12.151576896 +0000 UTC m=+1.410964171 container died 4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_pike, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, ceph=True, 
com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7) Nov 26 04:47:12 localhost systemd[1]: tmp-crun.MRKWOt.mount: Deactivated successfully. Nov 26 04:47:12 localhost systemd[1]: var-lib-containers-storage-overlay-afbc4cc4a8df1bde2b8457f0e837ed4f1038d5c628f76234d2bf9834d7df92a2-merged.mount: Deactivated successfully. Nov 26 04:47:12 localhost podman[285783]: 2025-11-26 09:47:12.271589108 +0000 UTC m=+0.106621360 container remove 4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_pike, distribution-scope=public, ceph=True, RELEASE=main, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7) Nov 26 04:47:12 localhost systemd[1]: libpod-conmon-4135a9dd6096e5e4af2b1d9754de9090a3b16788e4027a2ff4738a710dfe8f9c.scope: Deactivated successfully. 
Nov 26 04:47:12 localhost nova_compute[281415]: 2025-11-26 09:47:12.315 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:15 localhost nova_compute[281415]: 2025-11-26 09:47:15.616 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:15 localhost openstack_network_exporter[242153]: ERROR 09:47:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:47:15 localhost openstack_network_exporter[242153]: ERROR 09:47:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:47:15 localhost openstack_network_exporter[242153]: ERROR 09:47:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:47:15 localhost openstack_network_exporter[242153]: ERROR 09:47:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:47:15 localhost openstack_network_exporter[242153]: Nov 26 04:47:15 localhost openstack_network_exporter[242153]: ERROR 09:47:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:47:15 localhost openstack_network_exporter[242153]: Nov 26 04:47:17 localhost nova_compute[281415]: 2025-11-26 09:47:17.365 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:47:17 localhost podman[285868]: 2025-11-26 09:47:17.564051959 +0000 UTC m=+0.091687598 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 04:47:17 localhost podman[285868]: 2025-11-26 09:47:17.642388885 +0000 UTC m=+0.170024484 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Nov 26 04:47:17 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:47:18 localhost podman[285950]: Nov 26 04:47:18 localhost podman[285950]: 2025-11-26 09:47:18.192089671 +0000 UTC m=+0.085533319 container create 698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_wing, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, RELEASE=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:47:18 localhost systemd[1]: Started libpod-conmon-698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342.scope. Nov 26 04:47:18 localhost podman[285950]: 2025-11-26 09:47:18.155207203 +0000 UTC m=+0.048650881 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:47:18 localhost systemd[1]: tmp-crun.0DasJf.mount: Deactivated successfully. Nov 26 04:47:18 localhost systemd[1]: Started libcrun container. 
Nov 26 04:47:18 localhost podman[285950]: 2025-11-26 09:47:18.303742915 +0000 UTC m=+0.197186553 container init 698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_wing, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_CLEAN=True, version=7, io.openshift.expose-services=, distribution-scope=public, release=553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_BRANCH=main) Nov 26 04:47:18 localhost podman[285950]: 2025-11-26 09:47:18.316378145 +0000 UTC m=+0.209821783 container start 698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_wing, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., GIT_CLEAN=True, release=553, name=rhceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_BRANCH=main, 
build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:47:18 localhost podman[285950]: 2025-11-26 09:47:18.316684624 +0000 UTC m=+0.210128312 container attach 698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_wing, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, release=553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, ceph=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux ) Nov 26 04:47:18 localhost modest_wing[285965]: 167 167 Nov 26 04:47:18 localhost systemd[1]: libpod-698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342.scope: 
Deactivated successfully. Nov 26 04:47:18 localhost podman[285950]: 2025-11-26 09:47:18.321364949 +0000 UTC m=+0.214808667 container died 698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_wing, architecture=x86_64, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_CLEAN=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:47:18 localhost podman[285970]: 2025-11-26 09:47:18.432345671 +0000 UTC m=+0.093799344 container remove 698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_wing, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, 
io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.33.12) Nov 26 04:47:18 localhost systemd[1]: libpod-conmon-698c7c6caedee0f1b15d59af646823ee937749ef434be1e1e6532005340cf342.scope: Deactivated successfully. Nov 26 04:47:18 localhost systemd[1]: Reloading. Nov 26 04:47:18 localhost systemd-rc-local-generator[286008]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:47:18 localhost systemd-sysv-generator[286015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:18 localhost systemd[1]: var-lib-containers-storage-overlay-061c82e54b760bbb2e77c00c42467761a7a86dcb7312d5415c8a7b38e86a5039-merged.mount: Deactivated successfully. Nov 26 04:47:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:47:18 localhost systemd[1]: Reloading. Nov 26 04:47:18 localhost podman[286024]: 2025-11-26 09:47:18.990331403 +0000 UTC m=+0.097527900 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal) Nov 26 04:47:19 localhost systemd-rc-local-generator[286074]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:47:19 localhost systemd-sysv-generator[286077]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:47:19 localhost podman[286024]: 2025-11-26 09:47:19.042369177 +0000 UTC m=+0.149565694 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter) Nov 26 04:47:19 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:19 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:19 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 
04:47:19 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:47:19 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:19 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:19 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:19 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:47:19 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:47:19 localhost systemd[1]: Starting Ceph mds.mds.np0005536118.kohnma for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606... 
Nov 26 04:47:19 localhost podman[286134]: Nov 26 04:47:19 localhost podman[286134]: 2025-11-26 09:47:19.670639226 +0000 UTC m=+0.100350756 container create 1d1f97ef5369495d8412723f2e36f857f66a41bbb9df9f6854ff9a4f0a76e6fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mds-mds-np0005536118-kohnma, distribution-scope=public, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, release=553) Nov 26 04:47:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa4dd5b112d68de86682272ed5249227c1e7d70302ad17acfcb4e4777b42f19c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 04:47:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa4dd5b112d68de86682272ed5249227c1e7d70302ad17acfcb4e4777b42f19c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 04:47:19 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/aa4dd5b112d68de86682272ed5249227c1e7d70302ad17acfcb4e4777b42f19c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 04:47:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa4dd5b112d68de86682272ed5249227c1e7d70302ad17acfcb4e4777b42f19c/merged/var/lib/ceph/mds/ceph-mds.np0005536118.kohnma supports timestamps until 2038 (0x7fffffff) Nov 26 04:47:19 localhost podman[286134]: 2025-11-26 09:47:19.73367593 +0000 UTC m=+0.163387490 container init 1d1f97ef5369495d8412723f2e36f857f66a41bbb9df9f6854ff9a4f0a76e6fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mds-mds-np0005536118-kohnma, name=rhceph, vendor=Red Hat, Inc., release=553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main) Nov 26 04:47:19 localhost podman[286134]: 2025-11-26 09:47:19.637520585 +0000 UTC m=+0.067232135 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:47:19 localhost podman[286134]: 2025-11-26 09:47:19.744486494 +0000 UTC 
m=+0.174198054 container start 1d1f97ef5369495d8412723f2e36f857f66a41bbb9df9f6854ff9a4f0a76e6fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mds-mds-np0005536118-kohnma, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 04:47:19 localhost bash[286134]: 1d1f97ef5369495d8412723f2e36f857f66a41bbb9df9f6854ff9a4f0a76e6fd Nov 26 04:47:19 localhost systemd[1]: Started Ceph mds.mds.np0005536118.kohnma for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606. 
Nov 26 04:47:19 localhost ceph-mds[286153]: set uid:gid to 167:167 (ceph:ceph) Nov 26 04:47:19 localhost ceph-mds[286153]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2 Nov 26 04:47:19 localhost ceph-mds[286153]: main not setting numa affinity Nov 26 04:47:19 localhost ceph-mds[286153]: pidfile_write: ignore empty --pid-file Nov 26 04:47:19 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mds-mds-np0005536118-kohnma[286149]: starting mds.mds.np0005536118.kohnma at Nov 26 04:47:19 localhost ceph-mds[286153]: mds.mds.np0005536118.kohnma Updating MDS map to version 7 from mon.0 Nov 26 04:47:20 localhost nova_compute[281415]: 2025-11-26 09:47:20.620 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:20 localhost ceph-mds[286153]: mds.mds.np0005536118.kohnma Updating MDS map to version 8 from mon.0 Nov 26 04:47:20 localhost ceph-mds[286153]: mds.mds.np0005536118.kohnma Monitors have assigned me to become a standby. Nov 26 04:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:47:21 localhost podman[286173]: 2025-11-26 09:47:21.816989299 +0000 UTC m=+0.081726373 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:47:21 localhost podman[286173]: 2025-11-26 09:47:21.853435083 +0000 UTC m=+0.118172137 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:47:21 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:47:22 localhost nova_compute[281415]: 2025-11-26 09:47:22.403 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:23 localhost podman[286321]: 2025-11-26 09:47:23.72468096 +0000 UTC m=+0.085630422 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64) Nov 26 04:47:23 localhost podman[286321]: 2025-11-26 09:47:23.86376534 +0000 UTC m=+0.224714812 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
maintainer=Guillaume Abrioux , release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64) Nov 26 04:47:25 localhost systemd[1]: session-62.scope: Deactivated successfully. Nov 26 04:47:25 localhost systemd-logind[761]: Session 62 logged out. Waiting for processes to exit. Nov 26 04:47:25 localhost systemd-logind[761]: Removed session 62. 
Nov 26 04:47:25 localhost nova_compute[281415]: 2025-11-26 09:47:25.645 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:27 localhost nova_compute[281415]: 2025-11-26 09:47:27.447 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:27 localhost podman[240049]: time="2025-11-26T09:47:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:47:27 localhost podman[240049]: @ - - [26/Nov/2025:09:47:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149603 "" "Go-http-client/1.1" Nov 26 04:47:27 localhost podman[240049]: @ - - [26/Nov/2025:09:47:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17724 "" "Go-http-client/1.1" Nov 26 04:47:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:47:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:47:28 localhost systemd[1]: tmp-crun.vt3QoI.mount: Deactivated successfully. 
Nov 26 04:47:28 localhost podman[286506]: 2025-11-26 09:47:28.851257027 +0000 UTC m=+0.105237627 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 04:47:28 localhost podman[286506]: 2025-11-26 09:47:28.887538506 +0000 UTC 
m=+0.141519066 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:47:28 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:47:28 localhost podman[286507]: 2025-11-26 09:47:28.941916283 +0000 UTC m=+0.196008877 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:47:28 localhost podman[286507]: 2025-11-26 09:47:28.957298087 +0000 UTC m=+0.211390671 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 26 04:47:28 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:47:30 localhost nova_compute[281415]: 2025-11-26 09:47:30.685 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:31 localhost sshd[286543]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:47:32 localhost nova_compute[281415]: 2025-11-26 09:47:32.466 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:35 localhost nova_compute[281415]: 2025-11-26 09:47:35.688 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:35 localhost sshd[286545]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:47:37 localhost nova_compute[281415]: 2025-11-26 09:47:37.470 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:40 localhost nova_compute[281415]: 2025-11-26 09:47:40.731 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:47:41 localhost podman[286548]: 2025-11-26 09:47:41.844243958 +0000 UTC m=+0.098504680 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute) Nov 26 04:47:41 localhost systemd[1]: tmp-crun.CqaA4g.mount: Deactivated successfully. 
Nov 26 04:47:41 localhost podman[286547]: 2025-11-26 09:47:41.883501589 +0000 UTC m=+0.142109064 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:47:41 localhost podman[286547]: 2025-11-26 09:47:41.893086124 +0000 UTC m=+0.151693619 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:47:41 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:47:41 localhost podman[286548]: 2025-11-26 09:47:41.977506828 +0000 UTC m=+0.231767510 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 04:47:41 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:47:42 localhost nova_compute[281415]: 2025-11-26 09:47:42.514 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:45 localhost nova_compute[281415]: 2025-11-26 09:47:45.763 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:45 localhost openstack_network_exporter[242153]: ERROR 09:47:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:47:45 localhost openstack_network_exporter[242153]: ERROR 09:47:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:47:45 localhost openstack_network_exporter[242153]: ERROR 09:47:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:47:45 localhost openstack_network_exporter[242153]: ERROR 09:47:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:47:45 localhost openstack_network_exporter[242153]: Nov 26 04:47:45 localhost openstack_network_exporter[242153]: ERROR 09:47:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:47:45 localhost openstack_network_exporter[242153]: Nov 26 04:47:47 localhost nova_compute[281415]: 2025-11-26 09:47:47.559 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:47:47 localhost podman[286590]: 2025-11-26 09:47:47.833359929 +0000 UTC m=+0.093168366 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_id=ovn_controller) Nov 26 04:47:47 localhost podman[286590]: 2025-11-26 09:47:47.882391221 +0000 UTC m=+0.142199698 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:47:47 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:47:48 localhost nova_compute[281415]: 2025-11-26 09:47:48.957 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:48 localhost nova_compute[281415]: 2025-11-26 09:47:48.958 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.018 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.018 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.018 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.019 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:49 localhost 
nova_compute[281415]: 2025-11-26 09:47:49.019 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.019 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.137 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.137 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.138 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.138 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain 
(node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.138 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.570 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:47:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.777 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:47:49 localhost nova_compute[281415]: 2025-11-26 09:47:49.778 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:47:49 localhost systemd[1]: tmp-crun.ghcqYT.mount: Deactivated successfully. 
Nov 26 04:47:49 localhost podman[286637]: 2025-11-26 09:47:49.847297267 +0000 UTC m=+0.103531575 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers) Nov 26 04:47:49 localhost podman[286637]: 2025-11-26 09:47:49.860191555 +0000 UTC m=+0.116425823 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal) Nov 26 04:47:49 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.022 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.024 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=12246MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.024 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.024 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.222 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.222 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.223 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.257 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.733 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.740 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 
2025-11-26 09:47:50.766 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.831 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.836 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:47:50 localhost nova_compute[281415]: 2025-11-26 09:47:50.837 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.812s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:47:51 localhost nova_compute[281415]: 2025-11-26 09:47:51.666 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:51 localhost 
nova_compute[281415]: 2025-11-26 09:47:51.667 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:47:52 localhost nova_compute[281415]: 2025-11-26 09:47:52.602 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:47:52 localhost podman[286679]: 2025-11-26 09:47:52.82069416 +0000 UTC m=+0.083216458 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:47:52 localhost podman[286679]: 2025-11-26 09:47:52.828566653 +0000 UTC m=+0.091088871 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:47:52 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:47:52 localhost nova_compute[281415]: 2025-11-26 09:47:52.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:47:52 localhost nova_compute[281415]: 2025-11-26 09:47:52.850 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:47:52 localhost nova_compute[281415]: 2025-11-26 09:47:52.850 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:47:53 localhost nova_compute[281415]: 2025-11-26 09:47:53.181 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:47:53 localhost nova_compute[281415]: 2025-11-26 09:47:53.182 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:47:53 localhost nova_compute[281415]: 2025-11-26 09:47:53.182 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:47:53 localhost nova_compute[281415]: 2025-11-26 09:47:53.183 
281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:47:53 localhost nova_compute[281415]: 2025-11-26 09:47:53.568 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:47:53 localhost nova_compute[281415]: 2025-11-26 09:47:53.623 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:47:53 localhost nova_compute[281415]: 2025-11-26 09:47:53.624 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:47:55 localhost nova_compute[281415]: 2025-11-26 09:47:55.797 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:57 localhost podman[240049]: time="2025-11-26T09:47:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:47:57 localhost podman[240049]: @ - - [26/Nov/2025:09:47:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149603 "" "Go-http-client/1.1" Nov 26 04:47:57 localhost podman[240049]: @ - - [26/Nov/2025:09:47:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17725 "" "Go-http-client/1.1" Nov 26 04:47:57 localhost nova_compute[281415]: 2025-11-26 09:47:57.629 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:47:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:47:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:47:59 localhost systemd[1]: tmp-crun.a4Xivw.mount: Deactivated successfully. 
Nov 26 04:47:59 localhost podman[286703]: 2025-11-26 09:47:59.843541956 +0000 UTC m=+0.094247848 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent) Nov 26 04:47:59 localhost podman[286703]: 2025-11-26 09:47:59.877579246 +0000 UTC 
m=+0.128285128 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:47:59 localhost podman[286704]: 2025-11-26 09:47:59.893058833 +0000 UTC m=+0.139182073 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:47:59 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:47:59 localhost podman[286704]: 2025-11-26 09:47:59.907431917 +0000 UTC m=+0.153555137 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:47:59 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:48:00 localhost nova_compute[281415]: 2025-11-26 09:48:00.831 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:02 localhost systemd-logind[761]: Session 63 logged out. Waiting for processes to exit. Nov 26 04:48:02 localhost systemd[1]: session-63.scope: Deactivated successfully. Nov 26 04:48:02 localhost systemd[1]: session-63.scope: Consumed 1.367s CPU time. Nov 26 04:48:02 localhost systemd-logind[761]: Removed session 63. Nov 26 04:48:02 localhost nova_compute[281415]: 2025-11-26 09:48:02.667 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.580 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.581 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.595 12 DEBUG ceilometer.compute.pollsters [-] 
9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.596 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68671c8b-07a6-44ad-9782-c49b2201409f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.582250', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'008e3f7c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.824542868, 'message_signature': '525584b12aebb9b844cb6fcc1c92585607732a7cf8ab0957b509e9ea82d66fa1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.582250', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '008e5a2a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.824542868, 'message_signature': 'e4ff2cad75cb03c8c81b22ab650b91fc99bab8c72efd814d63b36f407b39afce'}]}, 'timestamp': '2025-11-26 09:48:03.597114', '_unique_id': '52fde16e8af04929b845a8d39a8d1d15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 
ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.599 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.635 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.635 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21f4ba58-96ff-4608-8568-36745f54a4b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.601435', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '009443cc-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': '474fb218e8fe51749dcad9519b5f7e3ba7ce71f2aa6e21c560e4065b00ee81a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.601435', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00945c68-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': 'c5afdfab9977e5547a6ffa595c19815953f4e499c9faf2b59d54a9f380c7f5f2'}]}, 'timestamp': '2025-11-26 09:48:03.636427', '_unique_id': 'bbc9dc01f2794ab389f540222ea3ad60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.637 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.639 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.639 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.639 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.640 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2add7b9-5521-491b-9374-1cd3fcb4a15e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.639478', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0094ec0a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': '048f13e0da8bd4da419f10a46a6b65b96ce66f3defe836a531b60d2bab0b47d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.639478', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00950474-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': '77187986c409c0ae7b0a2f00bb27aa4b63c4435d785676076b70d8ec639ccdae'}]}, 'timestamp': '2025-11-26 09:48:03.640659', '_unique_id': 'c6ee7b5ae13a4a6c82fd9b286ed78384'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.641 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.642 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.643 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.643 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf531f37-1d9e-4b06-85b8-278d73bfe2aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.643099', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00957922-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': 'c4122de025e4de6b2235b772b3a973099e6bac18fb745a9535255a96624ec65b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.643099', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00958eb2-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': '15e31c4bdb4095bf239d75a355ecfdafdb76ca87a19cff306342bbfd81ba1d2a'}]}, 'timestamp': '2025-11-26 09:48:03.644201', '_unique_id': 'b072b680a6ad452eb076bf255d5c8e56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging return
retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.645 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.646 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.651 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b8a8f57d-62e9-492d-bc9c-941c7d7002db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.646626', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '0096b030-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': '99b4a55f14c4e43fb42607f16f2837baae8b065a48235d08800b96145ba70cd4'}]}, 'timestamp': '2025-11-26 09:48:03.651634', '_unique_id': '65128cb92b4c4be4b445d2978f9400af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.653 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.654 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:48:03.652 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:48:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:48:03.653 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:48:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:48:03.654 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '832d169e-197b-4caf-923c-2afdf8395f73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.654062', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009726be-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': 'eb28d11a9b9f508d1d15b3acfa26fe1949b650e7f864dc2137b6611e23741241'}]}, 'timestamp': '2025-11-26 09:48:03.654699', '_unique_id': '2bd2dd3877be4d10bbaed1d990281f20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.656 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f058af3-e2b9-4883-bdae-b73069506a5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:48:03.657063', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0099f272-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.914597277, 
'message_signature': '6102d736ea0f534bbcb84504f4d43e30917a862a2d485fbb33fc06ace2ed7f95'}]}, 'timestamp': '2025-11-26 09:48:03.673049', '_unique_id': '863d7b37182041aaa7054b1dcae7838a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0801ecdc-c3da-4334-8f47-65b73caa1d52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.675446', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009a68c4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': '3a78642254053c51951cd78d084c72be8ce1068b1f159345062f1f6073f425de'}]}, 'timestamp': '2025-11-26 09:48:03.676104', '_unique_id': 'f1d2043aca6e4ec584f025cdd81cb9e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.679 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72d52e31-8a37-4995-ba50-942c84dcf98c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.678446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '009add7c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': '1d7edd4d47b3ee9bf6394695c64eb48cb7dddbfb256bf120e9bdc533d558630a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.678446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '009af42e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': 'd5cbe142c24cf04faef0dd72d16f553c00730e3a42a554281539f52bd3d28cfe'}]}, 'timestamp': '2025-11-26 09:48:03.679562', '_unique_id': '4f38d64b60c54bd192d5520c92fcd8e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.681 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.682 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '939ae1a6-701f-4de9-99d4-1afdfd22338b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7111, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.681986', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009b68dc-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': '4135adedee815636fc5c934b66853bd69c65072e52afc96cc3fd6862f926b10a'}]}, 'timestamp': '2025-11-26 09:48:03.682600', '_unique_id': '5ad624f5ad1c470b9e309d759e2de875'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.684 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.685 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8ed34230-404d-426b-b45e-747d32ca69a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.684892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '009bdbf0-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.824542868, 'message_signature': 'd62b39dbab278929dff8323d8143be105adc7fb98c16add794f045bfa7376160'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.684892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '009bf0a4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.824542868, 'message_signature': '1787fc4afbc05b0b5c236852b80ed9876f274da0aec3f0d4758a60fb25973274'}]}, 'timestamp': '2025-11-26 09:48:03.686087', '_unique_id': 'a22fdbe4421c413cafa52406723bf09e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.687 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.688 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.688 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.688 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b51577fd-6264-4294-ace3-2410d2b187f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.688877', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009c770e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': '72016e8ad2bf677069e64c222d71dd7d5d4103185574040b3ed175b3f796b15c'}]}, 'timestamp': '2025-11-26 09:48:03.689493', '_unique_id': '075816c2c0a94da89d60ffa5f772c87f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.690 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.691 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.691 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.692 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5c8e355-3dd8-4130-93fb-3b5942beacc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.691778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '009ce89c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': '6d5893eaa215923c26a742c829fd0cbe9603f357893aa59f0036e9450047716a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.691778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '009cfca6-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': '5151d8e9c0973f7789f455d0b70f6052b44f48d596084ac5f8cabe1a64b5ab25'}]}, 'timestamp': '2025-11-26 09:48:03.692887', '_unique_id': '05c3b1124f1f4122b0db316d208d378d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.693 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.695 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.695 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bda2132-e4ce-422b-8a3e-c72c82f2e79d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.695265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '009d6ec0-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.824542868, 'message_signature': '279b010ea739a4e5a99ed908b0fc56f04a27e0ab44601c73e9b4fd1bf888fa49'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.695265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '009d843c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.824542868, 'message_signature': '95a2c6fba5b26177cf82b50c703b05d7ad2890aa437885b6d60b59bd808ca80e'}]}, 'timestamp': '2025-11-26 09:48:03.696358', '_unique_id': 'fd48cb1e0935486892719afda4f3c453'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.698 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.699 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '28a1e379-cbbc-409c-b521-b92c9a2a8b52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.698984', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009e0092-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': '1a1c5f6d015f14f60a79ebaedc945ca4d01919934f61dcd4dcc1510a776c120f'}]}, 'timestamp': '2025-11-26 09:48:03.699612', '_unique_id': 'd768a04880e54aa0817d6a8338d9b088'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.701 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.702 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.702 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bf48ca2-a23b-463c-8ca5-1f0449b04472', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.702208', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009e7ebe-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': 'ce79b646755999101f2425374691d1081dd96bd698021197fd4ed9e07f2d6fb5'}]}, 'timestamp': '2025-11-26 09:48:03.702790', '_unique_id': '3cd781f9a9f34ddcbdbed08a0cafe21f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.703 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.704 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.705 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.705 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '275ae07f-b427-4345-91fe-1d4ff6513ada', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:48:03.705129', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '009ef0ec-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': 'f419d2ac5a96c6c4aee2ab9fcd60223c3a1e9601f4c40f2ffe6d9b1e16dd6fef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:48:03.705129', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '009f065e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.84371629, 'message_signature': 'aa641a5d8e096612b9f8d027773a9aad09b64695a580fbb86e77029587ac1304'}]}, 'timestamp': '2025-11-26 09:48:03.706240', '_unique_id': '70efa34cd9554e3b9ce7dc74fd264bd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.707 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.707 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8099f1e-7edd-4ada-9dfb-50e86a8e272c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.707706', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009f5000-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': '2f07e5a3d0c444aaecca078a9f0d16df4cb68406e39a3820e44b68213b300bb6'}]}, 'timestamp': '2025-11-26 09:48:03.708120', '_unique_id': 'ae1bb2fc03414cb48f8652dcfbc43206'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.709 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.709 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a886b462-5a9d-436d-b895-d577c6b152a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.709722', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009f9eac-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': '5e983cc54628b012f4b763fc0937ef2861beb48bd3cb17088608c456e9f31af1'}]}, 'timestamp': '2025-11-26 09:48:03.710138', '_unique_id': '202dbf99c66d451a88b60b26caa91fa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]:
2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.710 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.711 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.711 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '38d7c6a9-c5fe-4c37-a7dd-6ba9ee1ad475', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:48:03.711542', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '009fe5ba-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.888927614, 'message_signature': 'c20c23df77905e5142238f585be6f7253cca6dcecceb9e735d2f0d8c87fa1e84'}]}, 'timestamp': '2025-11-26 09:48:03.711922', '_unique_id': 'dac82f6c09d24956bce704f98188a626'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:48:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.713 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 12350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1a458dc-2659-42d7-91a8-d2f3d8139e19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12350000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:48:03.713373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '00a02d2c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 10958.914597277, 
'message_signature': '5bdef2c292fe8c3f2969b283cb6b1ce244eab8f7b98e1cda25adcaa9954e8ad3'}]}, 'timestamp': '2025-11-26 09:48:03.713740', '_unique_id': '37e347b49914400aa395bf3ae5313629'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:48:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:48:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:48:03.714 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:48:05 localhost nova_compute[281415]: 2025-11-26 09:48:05.946 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:07 localhost nova_compute[281415]: 2025-11-26 09:48:07.701 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:09 localhost sshd[286757]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:48:10 localhost nova_compute[281415]: 2025-11-26 09:48:10.979 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 04:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:48:12 localhost systemd[1]: Stopping User Manager for UID 1003...
Nov 26 04:48:12 localhost systemd[283517]: Activating special unit Exit the Session...
Nov 26 04:48:12 localhost systemd[283517]: Stopped target Main User Target.
Nov 26 04:48:12 localhost systemd[283517]: Stopped target Basic System.
Nov 26 04:48:12 localhost systemd[283517]: Stopped target Paths.
Nov 26 04:48:12 localhost systemd[283517]: Stopped target Sockets.
Nov 26 04:48:12 localhost systemd[283517]: Stopped target Timers.
Nov 26 04:48:12 localhost systemd[283517]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 26 04:48:12 localhost systemd[283517]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 26 04:48:12 localhost systemd[283517]: Closed D-Bus User Message Bus Socket.
Nov 26 04:48:12 localhost systemd[283517]: Stopped Create User's Volatile Files and Directories.
Nov 26 04:48:12 localhost systemd[283517]: Removed slice User Application Slice.
Nov 26 04:48:12 localhost systemd[283517]: Reached target Shutdown.
Nov 26 04:48:12 localhost systemd[283517]: Finished Exit the Session.
Nov 26 04:48:12 localhost systemd[283517]: Reached target Exit the Session.
Nov 26 04:48:12 localhost systemd[1]: user@1003.service: Deactivated successfully.
Nov 26 04:48:12 localhost systemd[1]: Stopped User Manager for UID 1003.
Nov 26 04:48:12 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 26 04:48:12 localhost systemd[1]: run-user-1003.mount: Deactivated successfully.
Nov 26 04:48:12 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Nov 26 04:48:12 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003.
Nov 26 04:48:12 localhost systemd[1]: Removed slice User Slice of UID 1003.
Nov 26 04:48:12 localhost systemd[1]: user-1003.slice: Consumed 1.796s CPU time.
Nov 26 04:48:12 localhost podman[286760]: 2025-11-26 09:48:12.608068209 +0000 UTC m=+0.101031537 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 26 04:48:12 localhost podman[286760]: 2025-11-26 09:48:12.648477606 +0000 UTC m=+0.141440924 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 26 04:48:12 localhost podman[286759]: 2025-11-26 09:48:12.660030582 +0000 UTC m=+0.149289475 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 26 04:48:12 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully.
Nov 26 04:48:12 localhost podman[286759]: 2025-11-26 09:48:12.672356862 +0000 UTC m=+0.161615755 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 26 04:48:12 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully.
Nov 26 04:48:12 localhost nova_compute[281415]: 2025-11-26 09:48:12.743 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:15 localhost openstack_network_exporter[242153]: ERROR 09:48:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 26 04:48:15 localhost openstack_network_exporter[242153]: ERROR 09:48:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:48:15 localhost openstack_network_exporter[242153]: ERROR 09:48:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:48:15 localhost openstack_network_exporter[242153]: ERROR 09:48:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 26 04:48:15 localhost openstack_network_exporter[242153]:
Nov 26 04:48:15 localhost openstack_network_exporter[242153]: ERROR 09:48:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 26 04:48:15 localhost openstack_network_exporter[242153]:
Nov 26 04:48:15 localhost nova_compute[281415]: 2025-11-26 09:48:15.980 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:17 localhost nova_compute[281415]: 2025-11-26 09:48:17.780 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:48:18 localhost podman[286823]: 2025-11-26 09:48:18.830839876 +0000 UTC m=+0.089596055 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 26 04:48:18 localhost podman[286823]: 2025-11-26 09:48:18.897526142 +0000 UTC m=+0.156282331 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 26 04:48:18 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:48:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.
Nov 26 04:48:20 localhost podman[286884]: 2025-11-26 09:48:20.827992336 +0000 UTC m=+0.087466019 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, config_id=edpm, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 26 04:48:20 localhost podman[286884]: 2025-11-26 09:48:20.843390721 +0000 UTC m=+0.102864394 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm)
Nov 26 04:48:20 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully.
Nov 26 04:48:21 localhost nova_compute[281415]: 2025-11-26 09:48:21.030 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:22 localhost nova_compute[281415]: 2025-11-26 09:48:22.835 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.
Nov 26 04:48:23 localhost podman[286903]: 2025-11-26 09:48:23.820883601 +0000 UTC m=+0.080248197 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 26 04:48:23 localhost podman[286903]: 2025-11-26 09:48:23.828773685 +0000 UTC m=+0.088138301 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors )
Nov 26 04:48:23 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully.
Nov 26 04:48:26 localhost nova_compute[281415]: 2025-11-26 09:48:26.078 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:27 localhost podman[240049]: time="2025-11-26T09:48:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 26 04:48:27 localhost podman[240049]: @ - - [26/Nov/2025:09:48:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149603 "" "Go-http-client/1.1"
Nov 26 04:48:27 localhost podman[240049]: @ - - [26/Nov/2025:09:48:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17730 "" "Go-http-client/1.1"
Nov 26 04:48:27 localhost nova_compute[281415]: 2025-11-26 09:48:27.880 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:48:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:48:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:48:30 localhost podman[286926]: 2025-11-26 09:48:30.829433986 +0000 UTC m=+0.088543082 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 26 04:48:30 localhost podman[286926]: 2025-11-26 09:48:30.836487993 +0000 UTC m=+0.095597110 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 26 04:48:30 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:48:30 localhost podman[286927]: 2025-11-26 09:48:30.93101957 +0000 UTC m=+0.187884557 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 26 04:48:30 localhost podman[286927]: 2025-11-26 09:48:30.943098402 +0000 UTC m=+0.199963339 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 26 04:48:30 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully.
Nov 26 04:48:31 localhost nova_compute[281415]: 2025-11-26 09:48:31.123 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:32 localhost nova_compute[281415]: 2025-11-26 09:48:32.938 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:36 localhost nova_compute[281415]: 2025-11-26 09:48:36.159 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:37 localhost nova_compute[281415]: 2025-11-26 09:48:37.943 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:41 localhost nova_compute[281415]: 2025-11-26 09:48:41.211 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:48:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:48:42 localhost podman[287086]: 2025-11-26 09:48:42.849737913 +0000 UTC m=+0.103416042 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:48:42 localhost systemd[1]: tmp-crun.gLCbXk.mount: Deactivated successfully. 
Nov 26 04:48:42 localhost podman[287087]: 2025-11-26 09:48:42.909432725 +0000 UTC m=+0.158066153 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true) Nov 26 04:48:42 localhost podman[287087]: 2025-11-26 09:48:42.92384754 +0000 UTC m=+0.172481018 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:48:42 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:48:42 localhost podman[287086]: 2025-11-26 09:48:42.990852002 +0000 UTC m=+0.244530141 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:48:42 localhost nova_compute[281415]: 2025-11-26 09:48:42.994 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:43 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:48:45 localhost podman[287203]: Nov 26 04:48:45 localhost podman[287203]: 2025-11-26 09:48:45.640881162 +0000 UTC m=+0.086230834 container create 5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_napier, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_BRANCH=main, release=553, CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.component=rhceph-container, io.openshift.expose-services=) Nov 26 04:48:45 localhost systemd[1]: Started libpod-conmon-5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939.scope. Nov 26 04:48:45 localhost podman[287203]: 2025-11-26 09:48:45.604906226 +0000 UTC m=+0.050255948 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:48:45 localhost systemd[1]: Started libcrun container. 
Nov 26 04:48:45 localhost podman[287203]: 2025-11-26 09:48:45.736347254 +0000 UTC m=+0.181696926 container init 5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_napier, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, build-date=2025-09-24T08:57:55, name=rhceph, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, vendor=Red Hat, Inc.) 
Nov 26 04:48:45 localhost podman[287203]: 2025-11-26 09:48:45.74782092 +0000 UTC m=+0.193170592 container start 5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_napier, ceph=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , version=7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph) Nov 26 04:48:45 localhost podman[287203]: 2025-11-26 09:48:45.748108319 +0000 UTC m=+0.193458041 container attach 5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_napier, name=rhceph, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:48:45 localhost upbeat_napier[287218]: 167 167 Nov 26 04:48:45 localhost systemd[1]: libpod-5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939.scope: Deactivated successfully. Nov 26 04:48:45 localhost podman[287203]: 2025-11-26 09:48:45.755476591 +0000 UTC m=+0.200826243 container died 5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_napier, vcs-type=git, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.33.12, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , distribution-scope=public, RELEASE=main, release=553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, 
io.openshift.expose-services=) Nov 26 04:48:45 localhost openstack_network_exporter[242153]: ERROR 09:48:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:48:45 localhost openstack_network_exporter[242153]: ERROR 09:48:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:48:45 localhost openstack_network_exporter[242153]: ERROR 09:48:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:48:45 localhost openstack_network_exporter[242153]: ERROR 09:48:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:48:45 localhost openstack_network_exporter[242153]: Nov 26 04:48:45 localhost openstack_network_exporter[242153]: ERROR 09:48:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:48:45 localhost openstack_network_exporter[242153]: Nov 26 04:48:45 localhost podman[287223]: 2025-11-26 09:48:45.901142898 +0000 UTC m=+0.135376908 container remove 5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_napier, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, 
GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7) Nov 26 04:48:45 localhost systemd[1]: libpod-conmon-5caeaabbad727212b2fa74a3b1f2e1322ba012daf9b220c99e0eb7b231121939.scope: Deactivated successfully. Nov 26 04:48:45 localhost systemd[1]: Reloading. Nov 26 04:48:46 localhost systemd-rc-local-generator[287260]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:48:46 localhost systemd-sysv-generator[287268]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:48:46 localhost nova_compute[281415]: 2025-11-26 09:48:46.213 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: var-lib-containers-storage-overlay-968b8b7923df940f1c2d8135ba568f44c4b627cd097d9965bb9255040be0f129-merged.mount: Deactivated successfully. Nov 26 04:48:46 localhost systemd[1]: Reloading. Nov 26 04:48:46 localhost systemd-rc-local-generator[287305]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:48:46 localhost systemd-sysv-generator[287309]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:48:46 localhost systemd[1]: Starting Ceph mgr.np0005536118.anceyj for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606... 
Nov 26 04:48:46 localhost nova_compute[281415]: 2025-11-26 09:48:46.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:46 localhost nova_compute[281415]: 2025-11-26 09:48:46.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 26 04:48:46 localhost nova_compute[281415]: 2025-11-26 09:48:46.880 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 26 04:48:46 localhost nova_compute[281415]: 2025-11-26 09:48:46.881 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:46 localhost nova_compute[281415]: 2025-11-26 09:48:46.881 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 26 04:48:46 localhost nova_compute[281415]: 2025-11-26 09:48:46.900 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:47 localhost podman[287369]: Nov 26 04:48:47 localhost podman[287369]: 2025-11-26 
09:48:47.212294124 +0000 UTC m=+0.085726769 container create e3f2c55f710feac8733d3167c070a7fb150bff9b82ab1e7135a9899403ae8b36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., vcs-type=git, version=7, release=553, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 04:48:47 localhost systemd[1]: tmp-crun.0j5iZN.mount: Deactivated successfully. 
Nov 26 04:48:47 localhost podman[287369]: 2025-11-26 09:48:47.177015689 +0000 UTC m=+0.050448334 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:48:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf6512f8818b10390521a1f0c53d66031122a5bafd3664b1cc90de2b0ca7d164/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 04:48:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf6512f8818b10390521a1f0c53d66031122a5bafd3664b1cc90de2b0ca7d164/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 26 04:48:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf6512f8818b10390521a1f0c53d66031122a5bafd3664b1cc90de2b0ca7d164/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 26 04:48:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf6512f8818b10390521a1f0c53d66031122a5bafd3664b1cc90de2b0ca7d164/merged/var/lib/ceph/mgr/ceph-np0005536118.anceyj supports timestamps until 2038 (0x7fffffff) Nov 26 04:48:47 localhost podman[287369]: 2025-11-26 09:48:47.290644039 +0000 UTC m=+0.164076654 container init e3f2c55f710feac8733d3167c070a7fb150bff9b82ab1e7135a9899403ae8b36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, ceph=True, RELEASE=main, vendor=Red Hat, Inc., name=rhceph) Nov 26 04:48:47 localhost podman[287369]: 2025-11-26 09:48:47.299989131 +0000 UTC m=+0.173421746 container start e3f2c55f710feac8733d3167c070a7fb150bff9b82ab1e7135a9899403ae8b36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj, release=553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:48:47 localhost bash[287369]: e3f2c55f710feac8733d3167c070a7fb150bff9b82ab1e7135a9899403ae8b36 Nov 26 04:48:47 localhost systemd[1]: Started Ceph 
mgr.np0005536118.anceyj for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606. Nov 26 04:48:47 localhost ceph-mgr[287388]: set uid:gid to 167:167 (ceph:ceph) Nov 26 04:48:47 localhost ceph-mgr[287388]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Nov 26 04:48:47 localhost ceph-mgr[287388]: pidfile_write: ignore empty --pid-file Nov 26 04:48:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'alerts' Nov 26 04:48:47 localhost ceph-mgr[287388]: mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 26 04:48:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'balancer' Nov 26 04:48:47 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:47.481+0000 7f54618cb140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 26 04:48:47 localhost ceph-mgr[287388]: mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 26 04:48:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'cephadm' Nov 26 04:48:47 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:47.550+0000 7f54618cb140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 26 04:48:47 localhost nova_compute[281415]: 2025-11-26 09:48:47.915 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:48 localhost nova_compute[281415]: 2025-11-26 09:48:48.040 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:48 localhost ceph-mgr[287388]: mgr[py] Loading python module 'crash' Nov 26 04:48:48 localhost ceph-mgr[287388]: mgr[py] Module crash has missing NOTIFY_TYPES member Nov 26 04:48:48 localhost ceph-mgr[287388]: 
mgr[py] Loading python module 'dashboard' Nov 26 04:48:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:48.207+0000 7f54618cb140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Nov 26 04:48:48 localhost sshd[287418]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:48:48 localhost ceph-mgr[287388]: mgr[py] Loading python module 'devicehealth' Nov 26 04:48:48 localhost ceph-mgr[287388]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 26 04:48:48 localhost ceph-mgr[287388]: mgr[py] Loading python module 'diskprediction_local' Nov 26 04:48:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:48.855+0000 7f54618cb140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 26 04:48:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Nov 26 04:48:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. 
Nov 26 04:48:48 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: from numpy import show_config as show_numpy_config Nov 26 04:48:49 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:49.000+0000 7f54618cb140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Loading python module 'influx' Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Module influx has missing NOTIFY_TYPES member Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Loading python module 'insights' Nov 26 04:48:49 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:49.060+0000 7f54618cb140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Loading python module 'iostat' Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Loading python module 'k8sevents' Nov 26 04:48:49 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:49.177+0000 7f54618cb140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Loading python module 'localpool' Nov 26 04:48:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Loading python module 'mds_autoscaler' Nov 26 04:48:49 localhost podman[287420]: 2025-11-26 09:48:49.654082646 +0000 UTC m=+0.102966898 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:48:49 localhost podman[287420]: 2025-11-26 09:48:49.74331529 +0000 UTC m=+0.192199572 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251118) Nov 26 04:48:49 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Loading python module 'mirroring' Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.843 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:49 localhost ceph-mgr[287388]: mgr[py] Loading python module 'nfs' Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.883 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.884 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.884 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.884 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:48:49 localhost nova_compute[281415]: 2025-11-26 09:48:49.885 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Loading python module 'orchestrator' Nov 26 04:48:50 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:50.062+0000 
7f54618cb140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:50.218+0000 7f54618cb140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Loading python module 'osd_perf_query' Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Loading python module 'osd_support' Nov 26 04:48:50 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:50.287+0000 7f54618cb140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.349 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Loading python module 'pg_autoscaler' Nov 26 04:48:50 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:50.352+0000 7f54618cb140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Loading python module 'progress' Nov 26 04:48:50 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:50.422+0000 7f54618cb140 -1 mgr[py] Module pg_autoscaler has missing 
NOTIFY_TYPES member Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.452 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.453 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Module progress has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Loading python module 'prometheus' Nov 26 04:48:50 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:50.492+0000 7f54618cb140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.678 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.680 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11998MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.681 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.682 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:50.818+0000 7f54618cb140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Loading python module 'rbd_support' Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.821 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.822 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.822 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.879 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost ceph-mgr[287388]: mgr[py] Loading python module 'restful' Nov 26 04:48:50 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:50.904+0000 7f54618cb140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.962 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 
'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.963 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:48:50 localhost nova_compute[281415]: 2025-11-26 09:48:50.991 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 04:48:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:48:51 localhost nova_compute[281415]: 2025-11-26 09:48:51.017 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_F16C,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 04:48:51 localhost nova_compute[281415]: 2025-11-26 09:48:51.064 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph 
df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Loading python module 'rgw' Nov 26 04:48:51 localhost podman[287560]: 2025-11-26 09:48:51.118238121 +0000 UTC m=+0.091378410 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Nov 26 04:48:51 localhost podman[287560]: 2025-11-26 09:48:51.135383118 +0000 UTC m=+0.108523457 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:48:51 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Loading python module 'rook' Nov 26 04:48:51 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:51.246+0000 7f54618cb140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 26 04:48:51 localhost nova_compute[281415]: 2025-11-26 09:48:51.270 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:51 localhost podman[287630]: 2025-11-26 09:48:51.389009844 +0000 UTC m=+0.084170971 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git) Nov 26 04:48:51 localhost podman[287630]: 2025-11-26 09:48:51.493447506 +0000 UTC m=+0.188608613 
container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:48:51 localhost nova_compute[281415]: 2025-11-26 09:48:51.589 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:48:51 localhost nova_compute[281415]: 2025-11-26 09:48:51.600 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:48:51 localhost nova_compute[281415]: 2025-11-26 
09:48:51.621 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:48:51 localhost nova_compute[281415]: 2025-11-26 09:48:51.624 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:48:51 localhost nova_compute[281415]: 2025-11-26 09:48:51.624 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Module rook has missing NOTIFY_TYPES member Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Loading python module 'selftest' Nov 26 04:48:51 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:51.704+0000 7f54618cb140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Loading python module 'snap_schedule' Nov 26 04:48:51 
localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:51.766+0000 7f54618cb140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Loading python module 'stats' Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Loading python module 'status' Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Module status has missing NOTIFY_TYPES member Nov 26 04:48:51 localhost ceph-mgr[287388]: mgr[py] Loading python module 'telegraf' Nov 26 04:48:51 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:51.963+0000 7f54618cb140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Loading python module 'telemetry' Nov 26 04:48:52 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:52.031+0000 7f54618cb140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Loading python module 'test_orchestrator' Nov 26 04:48:52 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:52.180+0000 7f54618cb140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Loading python module 'volumes' Nov 26 04:48:52 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:52.346+0000 7f54618cb140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Module volumes has 
missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Loading python module 'zabbix' Nov 26 04:48:52 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:52.543+0000 7f54618cb140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:48:52.610+0000 7f54618cb140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 26 04:48:52 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdb1e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Nov 26 04:48:52 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.103:6800/3586462190 Nov 26 04:48:52 localhost nova_compute[281415]: 2025-11-26 09:48:52.625 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:52 localhost nova_compute[281415]: 2025-11-26 09:48:52.626 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:52 localhost nova_compute[281415]: 2025-11-26 09:48:52.626 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:48:52 localhost nova_compute[281415]: 2025-11-26 09:48:52.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:48:52 localhost nova_compute[281415]: 2025-11-26 09:48:52.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:48:52 localhost nova_compute[281415]: 2025-11-26 09:48:52.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:48:53 localhost nova_compute[281415]: 2025-11-26 09:48:53.072 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:53 localhost nova_compute[281415]: 2025-11-26 09:48:53.188 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:48:53 localhost nova_compute[281415]: 2025-11-26 09:48:53.188 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:48:53 localhost nova_compute[281415]: 2025-11-26 09:48:53.189 281419 DEBUG nova.network.neutron [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:48:53 localhost nova_compute[281415]: 2025-11-26 09:48:53.189 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:48:53 localhost nova_compute[281415]: 2025-11-26 09:48:53.547 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:48:53 localhost nova_compute[281415]: 2025-11-26 09:48:53.562 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:48:53 localhost nova_compute[281415]: 2025-11-26 09:48:53.563 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:48:53 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.103:6800/3586462190 Nov 26 04:48:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:48:54 localhost podman[287771]: 2025-11-26 09:48:54.395913493 +0000 UTC m=+0.241291733 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', 
'--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:48:54 localhost podman[287771]: 2025-11-26 09:48:54.432472447 +0000 UTC m=+0.277850697 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, 
maintainer=The Prometheus Authors ) Nov 26 04:48:54 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:48:56 localhost nova_compute[281415]: 2025-11-26 09:48:56.298 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:48:57 localhost podman[240049]: time="2025-11-26T09:48:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:48:57 localhost podman[240049]: @ - - [26/Nov/2025:09:48:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151669 "" "Go-http-client/1.1" Nov 26 04:48:57 localhost podman[240049]: @ - - [26/Nov/2025:09:48:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18213 "" "Go-http-client/1.1" Nov 26 04:48:58 localhost nova_compute[281415]: 2025-11-26 09:48:58.110 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:01 localhost nova_compute[281415]: 2025-11-26 09:49:01.340 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:49:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:49:01 localhost podman[288470]: 2025-11-26 09:49:01.841044479 +0000 UTC m=+0.088033899 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:49:01 localhost podman[288470]: 2025-11-26 09:49:01.850351269 +0000 UTC 
m=+0.097340719 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 26 04:49:01 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:49:01 localhost podman[288471]: 2025-11-26 09:49:01.941976815 +0000 UTC m=+0.183570392 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 04:49:01 localhost podman[288471]: 2025-11-26 09:49:01.983424166 +0000 UTC m=+0.225017753 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:49:01 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:49:03 localhost nova_compute[281415]: 2025-11-26 09:49:03.163 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:49:03.654 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:49:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:49:03.654 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:49:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:49:03.655 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:49:06 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdb1e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Nov 26 04:49:06 localhost nova_compute[281415]: 2025-11-26 09:49:06.388 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:06 localhost podman[288587]: Nov 26 04:49:06 localhost podman[288587]: 2025-11-26 09:49:06.557233353 +0000 UTC m=+0.074544291 container create 3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_bartik, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-type=git, name=rhceph, version=7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , release=553, com.redhat.component=rhceph-container) Nov 26 04:49:06 localhost systemd[1]: Started libpod-conmon-3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6.scope. Nov 26 04:49:06 localhost systemd[1]: Started libcrun container. 
Nov 26 04:49:06 localhost podman[288587]: 2025-11-26 09:49:06.526085033 +0000 UTC m=+0.043395991 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:49:06 localhost podman[288587]: 2025-11-26 09:49:06.629466693 +0000 UTC m=+0.146777641 container init 3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_bartik, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, GIT_CLEAN=True, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=) Nov 26 04:49:06 localhost podman[288587]: 2025-11-26 09:49:06.642078573 +0000 UTC m=+0.159389511 container start 3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_bartik, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , vcs-type=git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7) Nov 26 04:49:06 localhost podman[288587]: 2025-11-26 09:49:06.644863337 +0000 UTC m=+0.162174275 container attach 3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_bartik, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.expose-services=) Nov 26 
04:49:06 localhost vigilant_bartik[288602]: 167 167 Nov 26 04:49:06 localhost systemd[1]: libpod-3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6.scope: Deactivated successfully. Nov 26 04:49:06 localhost podman[288587]: 2025-11-26 09:49:06.648140157 +0000 UTC m=+0.165451105 container died 3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_bartik, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 04:49:06 localhost podman[288607]: 2025-11-26 09:49:06.752409194 +0000 UTC m=+0.092400730 container remove 3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_bartik, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vcs-type=git, GIT_CLEAN=True, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.buildah.version=1.33.12, name=rhceph) Nov 26 04:49:06 localhost systemd[1]: libpod-conmon-3c3ebd7f240dca761963ee3a78674843086db9c8c3a423c1a983defb691bb7a6.scope: Deactivated successfully. 
Nov 26 04:49:06 localhost podman[288623]:
Nov 26 04:49:06 localhost podman[288623]: 2025-11-26 09:49:06.859162876 +0000 UTC m=+0.071846700 container create 471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mestorf, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, release=553)
Nov 26 04:49:06 localhost systemd[1]: Started libpod-conmon-471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad.scope.
Nov 26 04:49:06 localhost systemd[1]: Started libcrun container.
Nov 26 04:49:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fafde19d01616282ad5d50ce6bc9184c714fc63c9f514e2af122c56964ff504/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 26 04:49:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fafde19d01616282ad5d50ce6bc9184c714fc63c9f514e2af122c56964ff504/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 26 04:49:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fafde19d01616282ad5d50ce6bc9184c714fc63c9f514e2af122c56964ff504/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 26 04:49:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fafde19d01616282ad5d50ce6bc9184c714fc63c9f514e2af122c56964ff504/merged/var/lib/ceph/mon/ceph-np0005536118 supports timestamps until 2038 (0x7fffffff)
Nov 26 04:49:06 localhost podman[288623]: 2025-11-26 09:49:06.924001383 +0000 UTC m=+0.136685207 container init 471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mestorf, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, release=553, ceph=True, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph)
Nov 26 04:49:06 localhost podman[288623]: 2025-11-26 09:49:06.932576552 +0000 UTC m=+0.145260376 container start 471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mestorf, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55)
Nov 26 04:49:06 localhost podman[288623]: 2025-11-26 09:49:06.9328486 +0000 UTC m=+0.145532424 container attach 471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mestorf, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., name=rhceph, RELEASE=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 04:49:06 localhost podman[288623]: 2025-11-26 09:49:06.834800931 +0000 UTC m=+0.047484825 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:49:07 localhost systemd[1]: libpod-471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad.scope: Deactivated successfully.
Nov 26 04:49:07 localhost podman[288623]: 2025-11-26 09:49:07.03154387 +0000 UTC m=+0.244227744 container died 471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mestorf, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, distribution-scope=public, release=553, GIT_BRANCH=main, io.buildah.version=1.33.12)
Nov 26 04:49:07 localhost podman[288664]: 2025-11-26 09:49:07.108510353 +0000 UTC m=+0.066198820 container remove 471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mestorf, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vendor=Red Hat, Inc.)
Nov 26 04:49:07 localhost systemd[1]: libpod-conmon-471dc38713eccb7780ab9a35dad9b1137a142a043b605cccc2b282c5c6216dad.scope: Deactivated successfully.
Nov 26 04:49:07 localhost systemd[1]: Reloading.
Nov 26 04:49:07 localhost systemd-sysv-generator[288709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:49:07 localhost systemd-rc-local-generator[288702]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: var-lib-containers-storage-overlay-b3a6a4a00dec74b84513ba8eacbf1c55b42eba4d2ace26666e464eb083b512d0-merged.mount: Deactivated successfully.
Nov 26 04:49:07 localhost systemd[1]: Reloading.
Nov 26 04:49:07 localhost systemd-rc-local-generator[288747]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:49:07 localhost systemd-sysv-generator[288750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:49:07 localhost systemd[1]: Starting Ceph mon.np0005536118 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606...
Nov 26 04:49:08 localhost nova_compute[281415]: 2025-11-26 09:49:08.165 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:49:08 localhost podman[288811]:
Nov 26 04:49:08 localhost podman[288811]: 2025-11-26 09:49:08.262700551 +0000 UTC m=+0.076254743 container create bbb2d15582705a5b34fb4367dd88de91e3439671aa3a6cf770afa1b9821f781f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mon-np0005536118, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public)
Nov 26 04:49:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/182a500ca4ade7edc3bd32620c5cdbda21648c79651a358836c91ebc9ca1b0cc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 26 04:49:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/182a500ca4ade7edc3bd32620c5cdbda21648c79651a358836c91ebc9ca1b0cc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 26 04:49:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/182a500ca4ade7edc3bd32620c5cdbda21648c79651a358836c91ebc9ca1b0cc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 26 04:49:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/182a500ca4ade7edc3bd32620c5cdbda21648c79651a358836c91ebc9ca1b0cc/merged/var/lib/ceph/mon/ceph-np0005536118 supports timestamps until 2038 (0x7fffffff)
Nov 26 04:49:08 localhost podman[288811]: 2025-11-26 09:49:08.320155335 +0000 UTC m=+0.133709527 container init bbb2d15582705a5b34fb4367dd88de91e3439671aa3a6cf770afa1b9821f781f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mon-np0005536118, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 26 04:49:08 localhost podman[288811]: 2025-11-26 09:49:08.227519698 +0000 UTC m=+0.041073920 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:49:08 localhost podman[288811]: 2025-11-26 09:49:08.329771195 +0000 UTC m=+0.143325387 container start bbb2d15582705a5b34fb4367dd88de91e3439671aa3a6cf770afa1b9821f781f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mon-np0005536118, vcs-type=git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64)
Nov 26 04:49:08 localhost bash[288811]: bbb2d15582705a5b34fb4367dd88de91e3439671aa3a6cf770afa1b9821f781f
Nov 26 04:49:08 localhost systemd[1]: Started Ceph mon.np0005536118 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606.
Nov 26 04:49:08 localhost ceph-mon[288827]: set uid:gid to 167:167 (ceph:ceph)
Nov 26 04:49:08 localhost ceph-mon[288827]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 26 04:49:08 localhost ceph-mon[288827]: pidfile_write: ignore empty --pid-file
Nov 26 04:49:08 localhost ceph-mon[288827]: load: jerasure load: lrc
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: RocksDB version: 7.9.2
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Git sha 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: DB SUMMARY
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: DB Session ID: ZOF5ONGIRCTUGR7KNLS5
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: CURRENT file: CURRENT
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: IDENTITY file: IDENTITY
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005536118/store.db dir, Total Num: 0, files:
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005536118/store.db: 000004.log size: 761 ;
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.error_if_exists: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.create_if_missing: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.paranoid_checks: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.flush_verify_memtable_count: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.env: 0x55b98f99d9e0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.fs: PosixFileSystem
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.info_log: 0x55b99064ed20
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_file_opening_threads: 16
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.statistics: (nil)
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.use_fsync: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_log_file_size: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_manifest_file_size: 1073741824
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.log_file_time_to_roll: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.keep_log_file_num: 1000
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.recycle_log_file_num: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.allow_fallocate: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.allow_mmap_reads: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.allow_mmap_writes: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.use_direct_reads: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.create_missing_column_families: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.db_log_dir:
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.wal_dir:
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.table_cache_numshardbits: 6
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.WAL_ttl_seconds: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.WAL_size_limit_MB: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.manifest_preallocation_size: 4194304
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.is_fd_close_on_exec: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.advise_random_on_open: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.db_write_buffer_size: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.write_buffer_manager: 0x55b99065f540
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.access_hint_on_compaction_start: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.random_access_max_buffer_size: 1048576
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.use_adaptive_mutex: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.rate_limiter: (nil)
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.wal_recovery_mode: 2
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.enable_thread_tracking: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.enable_pipelined_write: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.unordered_write: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.allow_concurrent_memtable_write: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.write_thread_max_yield_usec: 100
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.write_thread_slow_yield_usec: 3
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.row_cache: None
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.wal_filter: None
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.avoid_flush_during_recovery: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.allow_ingest_behind: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.two_write_queues: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.manual_wal_flush: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.wal_compression: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.atomic_flush: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.persist_stats_to_disk: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.write_dbid_to_manifest: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.log_readahead_size: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.file_checksum_gen_factory: Unknown
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.best_efforts_recovery: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.allow_data_in_errors: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.db_host_id: __hostname__
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.enforce_single_del_contracts: true
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_background_jobs: 2
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_background_compactions: -1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_subcompactions: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.avoid_flush_during_shutdown: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.writable_file_max_buffer_size: 1048576
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.delayed_write_rate : 16777216
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_total_wal_size: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.stats_dump_period_sec: 600
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.stats_persist_period_sec: 600
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.stats_history_buffer_size: 1048576
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_open_files: -1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bytes_per_sync: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.wal_bytes_per_sync: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.strict_bytes_per_sync: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_readahead_size: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_background_flushes: -1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Compression algorithms supported:
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: #011kZSTD supported: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: #011kXpressCompression supported: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: #011kBZip2Compression supported: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: #011kLZ4Compression supported: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: #011kZlibCompression supported: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: #011kSnappyCompression supported: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005536118/store.db/MANIFEST-000005
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.merge_operator:
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_filter: None
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_filter_factory: None
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.sst_partitioner_factory: None
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.table_factory: BlockBasedTable
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b99064e980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b99064b350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.write_buffer_size: 33554432
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_write_buffer_number: 2
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression: NoCompression
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression: Disabled
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.prefix_extractor: nullptr
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.num_levels: 7
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.min_write_buffer_number_to_merge: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.window_bits: -14
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.level: 32767
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.strategy: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.enabled: false
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.level0_file_num_compaction_trigger: 4
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.target_file_size_base: 67108864
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.target_file_size_multiplier: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_base: 268435456
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.arena_block_size: 1048576
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.disable_auto_compactions: 0
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb:
Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.table_properties_collectors: Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.inplace_update_support: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.memtable_huge_page_size: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.bloom_locality: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.max_successive_merges: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.paranoid_file_checks: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.force_consistency_checks: 1 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.report_bg_io_stats: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.ttl: 2592000 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.enable_blob_files: false Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.min_blob_size: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.blob_file_size: 268435456 Nov 26 04:49:08 
localhost ceph-mon[288827]: rocksdb: Options.blob_compression_type: NoCompression Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.enable_blob_garbage_collection: false Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.blob_file_starting_level: 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005536118/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 96260784-ab37-4bfa-a747-b54286a1d4f8 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150548391225, "job": 1, "event": "recovery_started", "wal_files": [4]} Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150548393958, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, 
"index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150548, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150548394135, "job": 1, "event": "recovery_finished"} Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55b990672e00 Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: DB pointer 0x55b990768000 Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118 does not exist in monmap, will attempt to join an existing cluster Nov 26 04:49:08 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:49:08 
localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.84 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Sum 1/0 1.84 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, 
interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b99064b350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Nov 26 04:49:08 localhost ceph-mon[288827]: using public_addr v2:172.18.0.107:0/0 -> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] Nov 26 04:49:08 localhost ceph-mon[288827]: starting mon.np0005536118 rank -1 at public addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] at bind addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005536118 fsid 0d5e5e6d-3c4b-5efe-8c65-346ae6715606 Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(???) 
e0 preinit fsid 0d5e5e6d-3c4b-5efe-8c65-346ae6715606 Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing) e4 sync_obtain_latest_monmap Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing) e4 sync_obtain_latest_monmap obtained monmap e4 Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing).mds e16 new map Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing).mds e16 print_map#012e16#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-26T08:04:18.785479+0000#012modified#0112025-11-26T09:47:53.723677+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01179#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26654}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26654 members: 26654#012[mds.mds.np0005536117.tfthzg{0:26654} state up:active seq 11 addr [v2:172.18.0.106:6808/3840665669,v1:172.18.0.106:6809/3840665669] compat 
{c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005536119.dxhchp{-1:16902} state up:standby seq 1 addr [v2:172.18.0.108:6808/374897363,v1:172.18.0.108:6809/374897363] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005536118.kohnma{-1:16908} state up:standby seq 1 addr [v2:172.18.0.107:6808/2578626940,v1:172.18.0.107:6809/2578626940] compat {c=[1],r=[1],i=[17ff]}] Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing).osd e81 crush map has features 3314933000854323200, adjusting msgr requires Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mgr to host np0005536117.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mgr to host np0005536118.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 
172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mgr to host np0005536119.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Saving service mgr spec with placement label:mgr Nov 26 04:49:08 localhost ceph-mon[288827]: Deploying daemon mgr.np0005536117.ggibwg on np0005536117.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 
26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Deploying daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mon to host np0005536112.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label _admin to host np0005536112.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd='[{"prefix": "auth get-or-create", 
"entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Deploying daemon mgr.np0005536119.eupicg on np0005536119.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mon to host np0005536113.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label _admin to host np0005536113.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mon to host np0005536114.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label _admin to host np0005536114.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 
172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mon to host np0005536117.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: Added label _admin to host np0005536117.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' 
entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mon to host np0005536118.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:49:08 localhost ceph-mon[288827]: Added label _admin to host np0005536118.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Added label mon to host np0005536119.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost 
ceph-mon[288827]: Added label _admin to host np0005536119.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: Saving service mon spec with placement label:mon Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:49:08 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 
04:49:08 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:49:08 localhost ceph-mon[288827]: Deploying daemon mon.np0005536119 on np0005536119.localdomain Nov 26 04:49:08 localhost ceph-mon[288827]: mon.np0005536118@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3 Nov 26 04:49:11 localhost nova_compute[281415]: 2025-11-26 09:49:11.440 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:12 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdaf20 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Nov 26 04:49:13 localhost nova_compute[281415]: 2025-11-26 09:49:13.169 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:49:13 localhost podman[288866]: 2025-11-26 09:49:13.838116389 +0000 UTC m=+0.095897266 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:49:13 localhost podman[288866]: 2025-11-26 09:49:13.85240497 +0000 UTC m=+0.110185827 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:49:13 localhost podman[288867]: 2025-11-26 09:49:13.890980785 +0000 UTC m=+0.139436500 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
config_id=edpm) Nov 26 04:49:13 localhost podman[288867]: 2025-11-26 09:49:13.904825443 +0000 UTC m=+0.153281168 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm) Nov 26 04:49:13 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:49:13 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:49:14 localhost ceph-mon[288827]: mon.np0005536118@-1(probing) e4 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Nov 26 04:49:14 localhost ceph-mon[288827]: mon.np0005536118@-1(probing) e4 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Nov 26 04:49:14 localhost ceph-mon[288827]: mon.np0005536118@-1(probing) e5 my rank is now 4 (was -1) Nov 26 04:49:14 localhost ceph-mon[288827]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:49:14 localhost ceph-mon[288827]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 Nov 26 04:49:14 localhost ceph-mon[288827]: mon.np0005536118@4(electing) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:49:15 localhost openstack_network_exporter[242153]: ERROR 09:49:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:49:15 localhost openstack_network_exporter[242153]: ERROR 09:49:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:49:15 localhost openstack_network_exporter[242153]: ERROR 09:49:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:49:15 localhost openstack_network_exporter[242153]: ERROR 09:49:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:49:15 localhost openstack_network_exporter[242153]: Nov 26 04:49:15 localhost openstack_network_exporter[242153]: ERROR 09:49:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:49:15 localhost openstack_network_exporter[242153]: Nov 26 04:49:16 localhost ceph-mon[288827]: mon.np0005536118@4(electing) e5 adding 
peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Nov 26 04:49:16 localhost nova_compute[281415]: 2025-11-26 09:49:16.442 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:18 localhost nova_compute[281415]: 2025-11-26 09:49:18.202 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536118@4(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Nov 26 04:49:18 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdb600 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536118@4(electing).elector(1) mon.0 v2:172.18.0.103:3300/0 has newer monmap epoch 6 > my epoch 5, taking it Nov 26 04:49:18 localhost ceph-mon[288827]: Deploying daemon mon.np0005536118 on np0005536118.localdomain Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536114 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536112 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536113 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536112 is new leader, mons np0005536112,np0005536114,np0005536113,np0005536119 in quorum (ranks 0,1,2,3) Nov 26 04:49:18 localhost ceph-mon[288827]: overall HEALTH_OK Nov 26 04:49:18 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:18 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:18 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 
04:49:18 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:49:18 localhost ceph-mon[288827]: Deploying daemon mon.np0005536117 on np0005536117.localdomain Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536113 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536114 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536112 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536112 is new leader, mons np0005536112,np0005536114,np0005536113,np0005536119 in quorum (ranks 0,1,2,3) Nov 26 04:49:18 localhost ceph-mon[288827]: Health check failed: 1/5 mons down, quorum np0005536112,np0005536114,np0005536113,np0005536119 (MON_DOWN) Nov 26 04:49:18 localhost ceph-mon[288827]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005536112,np0005536114,np0005536113,np0005536119 Nov 26 04:49:18 localhost ceph-mon[288827]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005536112,np0005536114,np0005536113,np0005536119 Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536118 (rank 4) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum) Nov 26 04:49:18 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:18 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:18 localhost ceph-mon[288827]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:49:18 localhost ceph-mon[288827]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 Nov 26 04:49:18 localhost ceph-mon[288827]: mon.np0005536118@4(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 
04:49:19 localhost podman[289035]: 2025-11-26 09:49:19.540690946 +0000 UTC m=+0.104326730 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux ) Nov 26 04:49:19 localhost podman[289035]: 2025-11-26 09:49:19.678061543 +0000 UTC m=+0.241697327 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, release=553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git) Nov 26 04:49:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:49:19 localhost podman[289088]: 2025-11-26 09:49:19.986347718 +0000 UTC m=+0.102421583 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller) Nov 26 04:49:20 localhost podman[289088]: 2025-11-26 09:49:20.078428247 +0000 UTC m=+0.194502112 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 04:49:20 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:49:21 localhost nova_compute[281415]: 2025-11-26 09:49:21.483 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:49:21 localhost podman[289181]: 2025-11-26 09:49:21.830487412 +0000 UTC m=+0.084595715 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Nov 26 04:49:21 localhost podman[289181]: 2025-11-26 09:49:21.849342641 +0000 UTC m=+0.103450984 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal) Nov 26 04:49:21 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:49:23 localhost nova_compute[281415]: 2025-11-26 09:49:23.240 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:23 localhost ceph-mon[288827]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:49:23 localhost ceph-mon[288827]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 Nov 26 04:49:23 localhost ceph-mon[288827]: mon.np0005536118@4(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:49:23 localhost ceph-mon[288827]: mon.np0005536118@4(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:49:23 localhost ceph-mon[288827]: mon.np0005536118@4(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:49:23 localhost ceph-mon[288827]: mon.np0005536118@4(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap 
encoding,6=support isa/lrc erasure code,7=support shec erasure code} Nov 26 04:49:23 localhost ceph-mon[288827]: mon.np0005536118@4(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Nov 26 04:49:23 localhost ceph-mon[288827]: mon.np0005536118@4(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:49:23 localhost ceph-mon[288827]: mgrc update_daemon_metadata mon.np0005536118 metadata {addrs=[v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005536118.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005536118.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux} Nov 26 04:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:49:24 localhost podman[289344]: 2025-11-26 09:49:24.638109517 +0000 UTC m=+0.089927995 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:49:24 localhost podman[289344]: 2025-11-26 09:49:24.646255983 +0000 UTC m=+0.098074481 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:49:24 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536118 calling monitor election Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536118 calling monitor election Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536117 calling monitor election Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536113 calling monitor election Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536118 calling monitor election Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536112 calling monitor election Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536114 calling monitor election Nov 26 04:49:24 localhost ceph-mon[288827]: mon.np0005536112 is new leader, mons np0005536112,np0005536114,np0005536113,np0005536119,np0005536118,np0005536117 in quorum (ranks 0,1,2,3,4,5) Nov 26 04:49:24 localhost ceph-mon[288827]: Health check cleared: MON_DOWN (was: 2/6 mons down, quorum np0005536112,np0005536114,np0005536113,np0005536119) Nov 26 04:49:24 localhost ceph-mon[288827]: Cluster is now healthy Nov 26 04:49:24 localhost ceph-mon[288827]: overall HEALTH_OK Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' 
entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "config rm", "who": "osd/host:np0005536112", "name": "osd_memory_target"} : dispatch Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "config rm", "who": "osd/host:np0005536113", "name": "osd_memory_target"} : dispatch Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:24 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536112.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating 
np0005536112.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' 
Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:25 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:26 localhost nova_compute[281415]: 2025-11-26 09:49:26.517 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:26 localhost sshd[289563]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:49:26 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:49:27 localhost podman[240049]: time="2025-11-26T09:49:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:49:27 localhost podman[240049]: @ - - [26/Nov/2025:09:49:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:49:27 localhost podman[240049]: @ - - [26/Nov/2025:09:49:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18702 "" "Go-http-client/1.1" Nov 26 04:49:27 localhost ceph-mon[288827]: Reconfiguring mon.np0005536112 (monmap changed)... Nov 26 04:49:27 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536112 on np0005536112.localdomain Nov 26 04:49:27 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:27 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:27 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536112.srlncr (monmap changed)... 
Nov 26 04:49:27 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536112.srlncr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:49:27 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536112.srlncr on np0005536112.localdomain Nov 26 04:49:27 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:27 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:27 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536112.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:49:28 localhost nova_compute[281415]: 2025-11-26 09:49:28.274 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:28 localhost ceph-mon[288827]: Reconfiguring crash.np0005536112 (monmap changed)... Nov 26 04:49:28 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536112 on np0005536112.localdomain Nov 26 04:49:28 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:28 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:28 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536113.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:49:29 localhost ceph-mon[288827]: Reconfiguring crash.np0005536113 (monmap changed)... 
Nov 26 04:49:29 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536113 on np0005536113.localdomain Nov 26 04:49:29 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:29 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:29 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:29 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:49:30 localhost ceph-mon[288827]: Reconfiguring mon.np0005536113 (monmap changed)... Nov 26 04:49:30 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536113 on np0005536113.localdomain Nov 26 04:49:30 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:30 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:30 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536113.tjpmyn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:49:31 localhost nova_compute[281415]: 2025-11-26 09:49:31.562 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:31 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536113.tjpmyn (monmap changed)... 
Nov 26 04:49:31 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536113.tjpmyn on np0005536113.localdomain Nov 26 04:49:31 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:31 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:31 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:49:32 localhost ceph-mon[288827]: mon.np0005536118@4(peon).osd e81 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Nov 26 04:49:32 localhost ceph-mon[288827]: mon.np0005536118@4(peon).osd e81 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Nov 26 04:49:32 localhost ceph-mon[288827]: mon.np0005536118@4(peon).osd e82 e82: 6 total, 6 up, 6 in Nov 26 04:49:32 localhost systemd[1]: session-26.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-20.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-15.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-22.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-17.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-21.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-23.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-25.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-24.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-19.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-27.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd[1]: session-27.scope: Consumed 3min 27.151s CPU time. 
Nov 26 04:49:32 localhost systemd[1]: session-18.scope: Deactivated successfully. Nov 26 04:49:32 localhost systemd-logind[761]: Session 26 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:49:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:49:32 localhost systemd-logind[761]: Session 21 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 20 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 22 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 19 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 23 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 25 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 24 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 27 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 18 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 17 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Session 15 logged out. Waiting for processes to exit. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 26. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 20. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 15. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 22. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 17. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 21. 
Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 23. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 25. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 24. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 19. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 27. Nov 26 04:49:32 localhost systemd-logind[761]: Removed session 18. Nov 26 04:49:32 localhost podman[289569]: 2025-11-26 09:49:32.411339966 +0000 UTC m=+0.083205252 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible) Nov 26 04:49:32 localhost podman[289569]: 2025-11-26 09:49:32.422083901 +0000 UTC m=+0.093949197 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:49:32 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:49:32 localhost podman[289568]: 2025-11-26 09:49:32.465235183 +0000 UTC m=+0.139441720 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 26 04:49:32 localhost podman[289568]: 2025-11-26 09:49:32.49725316 +0000 UTC m=+0.171459707 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118) Nov 26 04:49:32 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:49:32 localhost sshd[289607]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:49:32 localhost systemd-logind[761]: New session 65 of user ceph-admin. Nov 26 04:49:32 localhost systemd[1]: Started Session 65 of User ceph-admin. Nov 26 04:49:32 localhost ceph-mon[288827]: Reconfiguring mon.np0005536114 (monmap changed)... Nov 26 04:49:32 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536114 on np0005536114.localdomain Nov 26 04:49:32 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:32 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' Nov 26 04:49:32 localhost ceph-mon[288827]: from='mgr.14120 172.18.0.103:0/3832063502' entity='mgr.np0005536112.srlncr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:49:32 localhost ceph-mon[288827]: from='client.? 172.18.0.103:0/955379167' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:49:32 localhost ceph-mon[288827]: Activating manager daemon np0005536114.ddbqmi Nov 26 04:49:32 localhost ceph-mon[288827]: from='client.? 
172.18.0.103:0/955379167' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 26 04:49:32 localhost ceph-mon[288827]: Manager daemon np0005536114.ddbqmi is now available Nov 26 04:49:32 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536114.ddbqmi/mirror_snapshot_schedule"} : dispatch Nov 26 04:49:32 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536114.ddbqmi/mirror_snapshot_schedule"} : dispatch Nov 26 04:49:32 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536114.ddbqmi/trash_purge_schedule"} : dispatch Nov 26 04:49:32 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536114.ddbqmi/trash_purge_schedule"} : dispatch Nov 26 04:49:33 localhost nova_compute[281415]: 2025-11-26 09:49:33.326 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:33 localhost ceph-mon[288827]: mon.np0005536118@4(peon).osd e82 _set_new_cache_sizes cache_size:1019552366 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:49:33 localhost systemd[1]: tmp-crun.Q5NxOb.mount: Deactivated successfully. 
Nov 26 04:49:34 localhost podman[289720]: 2025-11-26 09:49:33.999272377 +0000 UTC m=+0.114511878 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Nov 26 04:49:34 localhost podman[289720]: 2025-11-26 09:49:34.123450755 +0000 UTC m=+0.238690236 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph 
Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, release=553, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 04:49:34 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:34 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:34 localhost sshd[289768]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: [26/Nov/2025:09:49:34] ENGINE Bus STARTING Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: [26/Nov/2025:09:49:34] ENGINE Serving on http://172.18.0.105:8765 Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: [26/Nov/2025:09:49:34] ENGINE Serving on https://172.18.0.105:7150 Nov 26 04:49:35 
localhost ceph-mon[288827]: [26/Nov/2025:09:49:34] ENGINE Bus STARTED Nov 26 04:49:35 localhost ceph-mon[288827]: [26/Nov/2025:09:49:34] ENGINE Client ('172.18.0.105', 59832) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:35 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd/host:np0005536113", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd/host:np0005536113", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd/host:np0005536112", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd/host:np0005536112", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 
04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd/host:np0005536114", "name": 
"osd_memory_target"} : dispatch
Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd/host:np0005536114", "name": "osd_memory_target"} : dispatch
Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 26 04:49:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:49:36 localhost nova_compute[281415]: 2025-11-26 09:49:36.611 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:49:37 localhost ceph-mon[288827]: Adjusting osd_memory_target on np0005536117.localdomain to 836.6M
Nov 26 04:49:37 localhost ceph-mon[288827]: Adjusting osd_memory_target on np0005536119.localdomain to 836.6M
Nov 26 04:49:37 localhost ceph-mon[288827]: Unable to set osd_memory_target on np0005536117.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:49:37 localhost ceph-mon[288827]: Unable to set osd_memory_target on np0005536119.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:49:37 localhost ceph-mon[288827]: Adjusting osd_memory_target on np0005536118.localdomain to 836.6M
Nov 26 04:49:37 localhost ceph-mon[288827]: Unable to set osd_memory_target on np0005536118.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536112.localdomain:/etc/ceph/ceph.conf
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/etc/ceph/ceph.conf
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/etc/ceph/ceph.conf
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536112.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:49:37 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:49:38 localhost nova_compute[281415]: 2025-11-26 09:49:38.324 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:49:38 localhost ceph-mon[288827]: mon.np0005536118@4(peon).osd e82 _set_new_cache_sizes cache_size:1020042302 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536112.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:49:38 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:49:39 localhost ceph-mon[288827]: Updating np0005536112.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:49:39 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:49:39 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:49:39 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:49:39 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:49:39 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:39 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:40 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536114.ddbqmi (monmap changed)...
Nov 26 04:49:40 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:49:40 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:49:40 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536114.ddbqmi on np0005536114.localdomain
Nov 26 04:49:41 localhost nova_compute[281415]: 2025-11-26 09:49:41.645 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:49:41 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:41 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:41 localhost ceph-mon[288827]: Reconfiguring crash.np0005536114 (monmap changed)...
Nov 26 04:49:41 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536114.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:49:41 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536114 on np0005536114.localdomain
Nov 26 04:49:41 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536114.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:49:41 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:41 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:49:41 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:41 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.682186) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150582682421, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11290, "num_deletes": 765, "total_data_size": 17010169, "memory_usage": 17409256, "flush_reason": "Manual Compaction"}
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150582733508, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 10565747, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11295, "table_properties": {"data_size": 10514035, "index_size": 26326, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24325, "raw_key_size": 253901, "raw_average_key_size": 26, "raw_value_size": 10351783, "raw_average_value_size": 1064, "num_data_blocks": 990, "num_entries": 9727, "num_filter_entries": 9727, "num_deletions": 761, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150548, "oldest_key_time": 1764150548, "file_creation_time": 1764150582, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 51367 microseconds, and 27302 cpu microseconds.
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.733592) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 10565747 bytes OK
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.733624) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.735553) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.735579) EVENT_LOG_v1 {"time_micros": 1764150582735571, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.735595) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 16933734, prev total WAL file size 16934483, number of live WAL files 2.
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.738573) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353037' seq:72057594037927935, type:22 .. '6D6772737461740033373538' seq:0, type:0; will stop at (end)
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(1887B)]
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150582738707, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 10567634, "oldest_snapshot_seqno": -1}
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 8970 keys, 10554585 bytes, temperature: kUnknown
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150582785825, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 10554585, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10504789, "index_size": 26271, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22469, "raw_key_size": 240731, "raw_average_key_size": 26, "raw_value_size": 10352313, "raw_average_value_size": 1154, "num_data_blocks": 988, "num_entries": 8970, "num_filter_entries": 8970, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150548, "oldest_key_time": 0, "file_creation_time": 1764150582, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.786573) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 10554585 bytes
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.788087) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 222.1 rd, 221.9 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.1, 0.0 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 9732, records dropped: 762 output_compression: NoCompression
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.788120) EVENT_LOG_v1 {"time_micros": 1764150582788105, "job": 4, "event": "compaction_finished", "compaction_time_micros": 47573, "compaction_time_cpu_micros": 23423, "output_level": 6, "num_output_files": 1, "total_output_size": 10554585, "num_input_records": 9732, "num_output_records": 8970, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150582790433, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150582790666, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 26 04:49:42 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:49:42.738382) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:49:42 localhost ceph-mon[288827]: Reconfiguring crash.np0005536117 (monmap changed)...
Nov 26 04:49:42 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536117 on np0005536117.localdomain
Nov 26 04:49:42 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:42 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:42 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:42 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 26 04:49:43 localhost nova_compute[281415]: 2025-11-26 09:49:43.328 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:49:43 localhost ceph-mon[288827]: mon.np0005536118@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054514 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:49:43 localhost ceph-mon[288827]: Reconfiguring osd.2 (monmap changed)...
Nov 26 04:49:43 localhost ceph-mon[288827]: Reconfiguring daemon osd.2 on np0005536117.localdomain
Nov 26 04:49:43 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:43 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 26 04:49:43 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 04:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:49:44 localhost systemd[1]: tmp-crun.L6qU5t.mount: Deactivated successfully.
Nov 26 04:49:44 localhost ceph-mon[288827]: Reconfiguring osd.5 (monmap changed)...
Nov 26 04:49:44 localhost ceph-mon[288827]: Reconfiguring daemon osd.5 on np0005536117.localdomain
Nov 26 04:49:44 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:44 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:49:44 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:44 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:49:44 localhost podman[290619]: 2025-11-26 09:49:44.913499293 +0000 UTC m=+0.158278948 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 26 04:49:44 localhost podman[290620]: 2025-11-26 09:49:44.872315249 +0000 UTC m=+0.117437945 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 26 04:49:44 localhost podman[290619]: 2025-11-26 09:49:44.927742093 +0000 UTC m=+0.172521778 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 26 04:49:44 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully.
Nov 26 04:49:44 localhost podman[290620]: 2025-11-26 09:49:44.958276504 +0000 UTC m=+0.203399230 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 26 04:49:44 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully.
Nov 26 04:49:45 localhost openstack_network_exporter[242153]: ERROR 09:49:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 26 04:49:45 localhost openstack_network_exporter[242153]: ERROR 09:49:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:49:45 localhost openstack_network_exporter[242153]: ERROR 09:49:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:49:45 localhost openstack_network_exporter[242153]: ERROR 09:49:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 26 04:49:45 localhost openstack_network_exporter[242153]:
Nov 26 04:49:45 localhost openstack_network_exporter[242153]: ERROR 09:49:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 26 04:49:45 localhost openstack_network_exporter[242153]:
Nov 26 04:49:45 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536117.tfthzg (monmap changed)...
Nov 26 04:49:45 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536117.tfthzg on np0005536117.localdomain
Nov 26 04:49:45 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:45 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:49:45 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:45 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:49:46 localhost nova_compute[281415]: 2025-11-26 09:49:46.647 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:49:46 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536117.ggibwg (monmap changed)...
Nov 26 04:49:46 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536117.ggibwg on np0005536117.localdomain
Nov 26 04:49:46 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:46 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:46 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:49:47 localhost ceph-mon[288827]: Reconfiguring mon.np0005536117 (monmap changed)...
Nov 26 04:49:47 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536117 on np0005536117.localdomain
Nov 26 04:49:47 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:47 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:49:47 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:47 localhost ceph-mon[288827]: from='mgr.14184 ' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:49:47 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdb080 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Nov 26 04:49:47 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 26 04:49:47 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 26 04:49:47 localhost ceph-mon[288827]: mon.np0005536118@4(peon) e7 my rank is now 3 (was 4)
Nov 26 04:49:48 localhost ceph-mon[288827]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election
Nov 26 04:49:48 localhost ceph-mon[288827]: paxos.3).electionLogic(24) init, last seen epoch 24
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:49:48 localhost ceph-mgr[287388]: --2- 172.18.0.107:0/56348432 >> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] conn(0x55fcc9427800 0x55fcc9428b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 26 04:49:48 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.106:3300/0
Nov 26 04:49:48 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.106:3300/0
Nov 26 04:49:48 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdb600 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:49:48 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536118 on np0005536118.localdomain
Nov 26 04:49:48 localhost ceph-mon[288827]: Remove daemons mon.np0005536112
Nov 26 04:49:48 localhost ceph-mon[288827]: Safe to remove mon.np0005536112: new quorum should be ['np0005536114', 'np0005536113', 'np0005536119', 'np0005536118', 'np0005536117'] (from ['np0005536114', 'np0005536113', 'np0005536119', 'np0005536118', 'np0005536117'])
Nov 26 04:49:48 localhost ceph-mon[288827]: Removing monitor np0005536112 from monmap...
Nov 26 04:49:48 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "mon rm", "name": "np0005536112"} : dispatch
Nov 26 04:49:48 localhost ceph-mon[288827]: Removing daemon mon.np0005536112 from np0005536112.localdomain -- ports []
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536114 calling monitor election
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536113 calling monitor election
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536118 calling monitor election
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536117 calling monitor election
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536114 is new leader, mons np0005536114,np0005536113,np0005536119,np0005536118,np0005536117 in quorum (ranks 0,1,2,3,4)
Nov 26 04:49:48 localhost ceph-mon[288827]: overall HEALTH_OK
Nov 26 04:49:48 localhost nova_compute[281415]: 2025-11-26 09:49:48.355 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:49:48 localhost podman[290714]:
Nov 26 04:49:48 localhost podman[290714]: 2025-11-26 09:49:48.401969899 +0000 UTC m=+0.092930555 container create 1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_hellman, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, name=rhceph, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , vcs-type=git, build-date=2025-09-24T08:57:55)
Nov 26 04:49:48 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054728 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:49:48 localhost systemd[1]: Started libpod-conmon-1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad.scope.
Nov 26 04:49:48 localhost systemd[1]: Started libcrun container.
Nov 26 04:49:48 localhost podman[290714]: 2025-11-26 09:49:48.363911551 +0000 UTC m=+0.054872237 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:49:48 localhost podman[290714]: 2025-11-26 09:49:48.473823229 +0000 UTC m=+0.164783875 container init 1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_hellman, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-type=git, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vendor=Red Hat, Inc., release=553, name=rhceph)
Nov 26 04:49:48 localhost podman[290714]: 2025-11-26 09:49:48.484764069 +0000 UTC m=+0.175724725 container start 1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_hellman, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 26 04:49:48 localhost podman[290714]: 2025-11-26 09:49:48.48511967 +0000 UTC m=+0.176080316 container attach 1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_hellman, release=553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=,
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, distribution-scope=public, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 04:49:48 localhost silly_hellman[290729]: 167 167 Nov 26 04:49:48 localhost systemd[1]: libpod-1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad.scope: Deactivated successfully. 
Nov 26 04:49:48 localhost podman[290714]: 2025-11-26 09:49:48.488529172 +0000 UTC m=+0.179489838 container died 1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_hellman, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main) Nov 26 04:49:48 localhost podman[290734]: 2025-11-26 09:49:48.597004137 +0000 UTC m=+0.098630668 container remove 1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_hellman, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, maintainer=Guillaume Abrioux , vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 04:49:48 localhost systemd[1]: libpod-conmon-1c0e9747763fb4bc1fddcb1de29f1dc5671a880414622e9bc3543326a06956ad.scope: Deactivated successfully. Nov 26 04:49:48 localhost nova_compute[281415]: 2025-11-26 09:49:48.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:49:49 localhost podman[290804]: Nov 26 04:49:49 localhost podman[290804]: 2025-11-26 09:49:49.357434049 +0000 UTC m=+0.089991597 container create f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_faraday, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, version=7, GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, maintainer=Guillaume Abrioux , release=553, CEPH_POINT_RELEASE=, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.) Nov 26 04:49:49 localhost systemd[1]: Started libpod-conmon-f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f.scope. Nov 26 04:49:49 localhost systemd[1]: var-lib-containers-storage-overlay-ec1fbbab4942da86af0c4de81405e4e0bb722d677d47aa1bd9f50d0aafd43fc5-merged.mount: Deactivated successfully. Nov 26 04:49:49 localhost systemd[1]: Started libcrun container. Nov 26 04:49:49 localhost podman[290804]: 2025-11-26 09:49:49.320398561 +0000 UTC m=+0.052956169 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:49:49 localhost podman[290804]: 2025-11-26 09:49:49.430531035 +0000 UTC m=+0.163088583 container init f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_faraday, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, release=553, io.openshift.expose-services=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64) Nov 26 04:49:49 localhost agitated_faraday[290819]: 167 167 Nov 26 04:49:49 localhost podman[290804]: 2025-11-26 09:49:49.440762525 +0000 UTC m=+0.173320083 container start f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_faraday, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux ) Nov 26 04:49:49 localhost podman[290804]: 2025-11-26 09:49:49.442435225 +0000 UTC m=+0.174992823 container attach f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_faraday, vcs-type=git, RELEASE=main, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, name=rhceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64) Nov 26 04:49:49 localhost systemd[1]: libpod-f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f.scope: Deactivated successfully. Nov 26 04:49:49 localhost podman[290804]: 2025-11-26 09:49:49.445982912 +0000 UTC m=+0.178540470 container died f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_faraday, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, RELEASE=main, release=553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, architecture=x86_64, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph) Nov 26 04:49:49 localhost systemd[1]: tmp-crun.la1ZDl.mount: Deactivated successfully. Nov 26 04:49:49 localhost podman[290825]: 2025-11-26 09:49:49.556998594 +0000 UTC m=+0.098438744 container remove f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_faraday, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, release=553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, version=7, GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7) Nov 26 04:49:49 localhost systemd[1]: libpod-conmon-f96c43222019fdaad84c7e56a3a5a64d987e5519a5b75b5a8654561cd8f0971f.scope: Deactivated successfully. 
Nov 26 04:49:49 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:49 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:49 localhost ceph-mon[288827]: Reconfiguring osd.0 (monmap changed)...
Nov 26 04:49:49 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 26 04:49:49 localhost ceph-mon[288827]: Reconfiguring daemon osd.0 on np0005536118.localdomain
Nov 26 04:49:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:49:50 localhost podman[290885]: 2025-11-26 09:49:50.337065469 +0000 UTC m=+0.091948417 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro',
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 26 04:49:50 localhost systemd[1]: var-lib-containers-storage-overlay-c5ff156e5d97a0800fb6a26672941ffd65eefc8de16bd82b1029250a255170c8-merged.mount: Deactivated successfully. Nov 26 04:49:50 localhost podman[290885]: 2025-11-26 09:49:50.417559619 +0000 UTC m=+0.172442557 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 04:49:50 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated 
successfully. Nov 26 04:49:50 localhost podman[290922]: Nov 26 04:49:50 localhost podman[290922]: 2025-11-26 09:49:50.496480351 +0000 UTC m=+0.100114623 container create 0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_maxwell, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, io.buildah.version=1.33.12, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:49:50 localhost systemd[1]: Started libpod-conmon-0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5.scope. Nov 26 04:49:50 localhost podman[290922]: 2025-11-26 09:49:50.460759622 +0000 UTC m=+0.064393934 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:49:50 localhost systemd[1]: Started libcrun container. 
Nov 26 04:49:50 localhost podman[290922]: 2025-11-26 09:49:50.584523568 +0000 UTC m=+0.188157840 container init 0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_maxwell, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.buildah.version=1.33.12, ceph=True, architecture=x86_64, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 04:49:50 localhost podman[290922]: 2025-11-26 09:49:50.602435269 +0000 UTC m=+0.206069551 container start 0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_maxwell, version=7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:49:50 localhost podman[290922]: 2025-11-26 09:49:50.603582533 +0000 UTC m=+0.207216815 container attach 0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_maxwell, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, vcs-type=git, architecture=x86_64, 
io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 04:49:50 localhost silly_maxwell[290940]: 167 167 Nov 26 04:49:50 localhost systemd[1]: libpod-0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5.scope: Deactivated successfully. Nov 26 04:49:50 localhost podman[290922]: 2025-11-26 09:49:50.6100924 +0000 UTC m=+0.213726702 container died 0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_maxwell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git, ceph=True, release=553, io.openshift.expose-services=, com.redhat.component=rhceph-container, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:49:50 localhost podman[290945]: 2025-11-26 09:49:50.714728708 +0000 UTC m=+0.091150662 container remove 0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_maxwell, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, distribution-scope=public, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, RELEASE=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7) Nov 26 04:49:50 localhost systemd[1]: libpod-conmon-0830bc08daea698f2e9f27f3f4b22e636a197846f3880328202bd1e7806a1ba5.scope: Deactivated successfully. Nov 26 04:49:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:50 localhost ceph-mon[288827]: Reconfiguring osd.4 (monmap changed)... 
Nov 26 04:49:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 26 04:49:50 localhost ceph-mon[288827]: Reconfiguring daemon osd.4 on np0005536118.localdomain
Nov 26 04:49:50 localhost nova_compute[281415]: 2025-11-26 09:49:50.843 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:49:50 localhost nova_compute[281415]: 2025-11-26 09:49:50.863 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:49:50 localhost nova_compute[281415]: 2025-11-26 09:49:50.863 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 26 04:49:51 localhost systemd[1]: tmp-crun.xr8VXG.mount: Deactivated successfully.
Nov 26 04:49:51 localhost systemd[1]: var-lib-containers-storage-overlay-921e80bd156d2771677bfdc74bd600074cb3d79aaea43ab3d1c443ea3f38edfe-merged.mount: Deactivated successfully.
Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.679 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:49:51 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:51 localhost ceph-mon[288827]: Removed label mon from host np0005536112.localdomain
Nov 26 04:49:51 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:51 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:49:51 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:49:51 localhost nova_compute[281415]:
2025-11-26 09:49:51.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.896 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.897 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.897 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.897 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: 
np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:49:51 localhost nova_compute[281415]: 2025-11-26 09:49:51.898 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:49:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:49:52 localhost podman[291021]: 2025-11-26 09:49:52.02743125 +0000 UTC m=+0.094852353 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc.) 
Nov 26 04:49:52 localhost podman[291029]: Nov 26 04:49:52 localhost podman[291021]: 2025-11-26 09:49:52.052475046 +0000 UTC m=+0.119896149 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git) Nov 26 04:49:52 localhost podman[291029]: 2025-11-26 09:49:52.063107437 +0000 UTC m=+0.106654190 container create 461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_gates, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, architecture=x86_64, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, ceph=True, io.openshift.expose-services=, RELEASE=main) Nov 26 04:49:52 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:49:52 localhost systemd[1]: Started libpod-conmon-461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900.scope. Nov 26 04:49:52 localhost podman[291029]: 2025-11-26 09:49:52.023591054 +0000 UTC m=+0.067137867 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:49:52 localhost systemd[1]: Started libcrun container. Nov 26 04:49:52 localhost podman[291029]: 2025-11-26 09:49:52.151466494 +0000 UTC m=+0.195013237 container init 461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_gates, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.33.12, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported 
base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, name=rhceph, distribution-scope=public) Nov 26 04:49:52 localhost podman[291029]: 2025-11-26 09:49:52.162658112 +0000 UTC m=+0.206204855 container start 461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_gates, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , ceph=True, RELEASE=main, architecture=x86_64, release=553, build-date=2025-09-24T08:57:55, version=7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12) Nov 26 04:49:52 localhost podman[291029]: 2025-11-26 09:49:52.162991262 +0000 UTC m=+0.206538025 container attach 461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_gates, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, name=rhceph, vcs-type=git) Nov 26 04:49:52 localhost angry_gates[291078]: 167 167 Nov 26 04:49:52 localhost systemd[1]: libpod-461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900.scope: Deactivated successfully. Nov 26 04:49:52 localhost podman[291029]: 2025-11-26 09:49:52.168760356 +0000 UTC m=+0.212307099 container died 461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_gates, io.buildah.version=1.33.12, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, release=553, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main) Nov 26 04:49:52 localhost podman[291083]: 2025-11-26 09:49:52.266913129 +0000 UTC m=+0.086424670 container remove 461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_gates, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7) Nov 26 04:49:52 localhost systemd[1]: libpod-conmon-461b908aa45793645710886d7dbea01d2c45357a4b4817c57afd68421adbb900.scope: Deactivated successfully. Nov 26 04:49:52 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e7 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:49:52 localhost ceph-mon[288827]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3809523685' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.356 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:49:52 localhost systemd[1]: var-lib-containers-storage-overlay-5c80f0144c485daee39b27bf7472f66f1bcb8636fbb74b14f5f6003ee868b77e-merged.mount: Deactivated successfully. Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.488 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.489 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.679 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.681 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11825MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.681 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.681 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.834 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.835 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.835 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:49:52 localhost nova_compute[281415]: 2025-11-26 09:49:52.896 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:49:52 localhost podman[291156]: Nov 26 04:49:52 localhost podman[291156]: 2025-11-26 09:49:52.946302975 +0000 UTC m=+0.078850661 container create a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_chaplygin, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public) Nov 26 04:49:52 localhost systemd[1]: Started libpod-conmon-a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5.scope. Nov 26 04:49:53 localhost systemd[1]: Started libcrun container. Nov 26 04:49:53 localhost podman[291156]: 2025-11-26 09:49:53.017057191 +0000 UTC m=+0.149604877 container init a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_chaplygin, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:49:53 localhost podman[291156]: 2025-11-26 09:49:52.926166267 +0000 UTC m=+0.058714023 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:49:53 localhost podman[291156]: 2025-11-26 09:49:53.035548319 +0000 UTC m=+0.168096005 container start a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_chaplygin, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55) Nov 26 04:49:53 localhost podman[291156]: 2025-11-26 09:49:53.035769895 +0000 UTC m=+0.168317601 container attach a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_chaplygin, maintainer=Guillaume Abrioux , io.openshift.expose-services=, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, 
Inc., build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True) Nov 26 04:49:53 localhost optimistic_chaplygin[291172]: 167 167 Nov 26 04:49:53 localhost systemd[1]: libpod-a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5.scope: Deactivated successfully. Nov 26 04:49:53 localhost podman[291156]: 2025-11-26 09:49:53.043825279 +0000 UTC m=+0.176372985 container died a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_chaplygin, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, distribution-scope=public, release=553, 
vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:49:53 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)... Nov 26 04:49:53 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain Nov 26 04:49:53 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:53 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:53 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:53 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:49:53 localhost podman[291196]: 2025-11-26 09:49:53.138659231 +0000 UTC m=+0.082573793 container remove a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_chaplygin, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, architecture=x86_64, distribution-scope=public, RELEASE=main) Nov 26 04:49:53 localhost systemd[1]: libpod-conmon-a7db931c53d9a1b8bac36576871e75ef5ba1eb34e3c9d473a009316e38f77bc5.scope: Deactivated successfully. Nov 26 04:49:53 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e7 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:49:53 localhost ceph-mon[288827]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3493978072' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:49:53 localhost nova_compute[281415]: 2025-11-26 09:49:53.403 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:53 localhost systemd[1]: tmp-crun.I8oU2l.mount: Deactivated successfully. Nov 26 04:49:53 localhost systemd[1]: var-lib-containers-storage-overlay-a9ff53b4a3b96ab9b0520c04bd2ca7dfaec70fcb973cd5789b3170b8993657ba-merged.mount: Deactivated successfully. 
Nov 26 04:49:53 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:49:53 localhost nova_compute[281415]: 2025-11-26 09:49:53.430 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:49:53 localhost nova_compute[281415]: 2025-11-26 09:49:53.438 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:49:53 localhost nova_compute[281415]: 2025-11-26 09:49:53.456 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:49:53 localhost nova_compute[281415]: 2025-11-26 09:49:53.458 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:49:53 
localhost nova_compute[281415]: 2025-11-26 09:49:53.458 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:49:53 localhost podman[291266]: Nov 26 04:49:53 localhost podman[291266]: 2025-11-26 09:49:53.895252818 +0000 UTC m=+0.080083877 container create 94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mcnulty, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, version=7, vcs-type=git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, release=553, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:49:53 localhost systemd[1]: Started libpod-conmon-94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0.scope. Nov 26 04:49:53 localhost systemd[1]: Started libcrun container. 
Nov 26 04:49:53 localhost podman[291266]: 2025-11-26 09:49:53.961312653 +0000 UTC m=+0.146143712 container init 94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mcnulty, vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7) Nov 26 04:49:53 localhost podman[291266]: 2025-11-26 09:49:53.871611875 +0000 UTC m=+0.056442924 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:49:53 localhost podman[291266]: 2025-11-26 09:49:53.972028506 +0000 UTC m=+0.156859565 container start 94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mcnulty, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 
7, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux ) Nov 26 04:49:53 localhost podman[291266]: 2025-11-26 09:49:53.972363446 +0000 UTC m=+0.157194635 container attach 94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mcnulty, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 04:49:53 localhost optimistic_mcnulty[291281]: 167 167 Nov 26 04:49:53 localhost systemd[1]: libpod-94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0.scope: Deactivated successfully. Nov 26 04:49:53 localhost podman[291266]: 2025-11-26 09:49:53.975770169 +0000 UTC m=+0.160601238 container died 94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mcnulty, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=) Nov 26 04:49:54 localhost nova_compute[281415]: 2025-11-26 09:49:54.457 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:49:54 localhost systemd[1]: var-lib-containers-storage-overlay-d66bb4bb440a09c0045e52a4a521d07b3eed478829aab04d8db3f85744da7917-merged.mount: Deactivated successfully. 
Nov 26 04:49:54 localhost ceph-mon[288827]: Removed label mgr from host np0005536112.localdomain Nov 26 04:49:54 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536118.anceyj (monmap changed)... Nov 26 04:49:54 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:49:54 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:54 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:54 localhost ceph-mon[288827]: Removed label _admin from host np0005536112.localdomain Nov 26 04:49:54 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:54 localhost ceph-mon[288827]: Reconfiguring mon.np0005536118 (monmap changed)... Nov 26 04:49:54 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:49:54 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536118 on np0005536118.localdomain Nov 26 04:49:54 localhost podman[291286]: 2025-11-26 09:49:54.548907779 +0000 UTC m=+0.559803108 container remove 94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mcnulty, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, 
com.redhat.component=rhceph-container, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, name=rhceph, version=7) Nov 26 04:49:54 localhost systemd[1]: libpod-conmon-94f9ec5152225056ef2f343b23be49336cb18a6d92716d435b767b535b5e27f0.scope: Deactivated successfully. Nov 26 04:49:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:49:54 localhost systemd[1]: tmp-crun.INGanN.mount: Deactivated successfully. Nov 26 04:49:54 localhost podman[291302]: 2025-11-26 09:49:54.838579382 +0000 UTC m=+0.095732651 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:49:54 localhost nova_compute[281415]: 2025-11-26 09:49:54.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:49:54 localhost nova_compute[281415]: 2025-11-26 09:49:54.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:49:54 localhost nova_compute[281415]: 2025-11-26 09:49:54.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:49:54 localhost podman[291302]: 2025-11-26 09:49:54.84877239 +0000 UTC m=+0.105925659 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:49:54 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:49:55 localhost nova_compute[281415]: 2025-11-26 09:49:55.242 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:49:55 localhost nova_compute[281415]: 2025-11-26 09:49:55.243 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:49:55 localhost nova_compute[281415]: 2025-11-26 09:49:55.243 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:49:55 localhost nova_compute[281415]: 2025-11-26 09:49:55.243 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:49:55 localhost nova_compute[281415]: 2025-11-26 09:49:55.617 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:49:55 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:55 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:55 localhost ceph-mon[288827]: Reconfiguring crash.np0005536119 (monmap changed)... 
Nov 26 04:49:55 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:49:55 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain Nov 26 04:49:55 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:55 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:55 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 26 04:49:55 localhost nova_compute[281415]: 2025-11-26 09:49:55.642 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:49:55 localhost nova_compute[281415]: 2025-11-26 09:49:55.643 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:49:56 localhost ceph-mon[288827]: Reconfiguring osd.1 (monmap changed)... 
Nov 26 04:49:56 localhost ceph-mon[288827]: Reconfiguring daemon osd.1 on np0005536119.localdomain Nov 26 04:49:56 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:56 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:56 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 26 04:49:56 localhost nova_compute[281415]: 2025-11-26 09:49:56.684 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:57 localhost podman[240049]: time="2025-11-26T09:49:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:49:57 localhost podman[240049]: @ - - [26/Nov/2025:09:49:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:49:57 localhost podman[240049]: @ - - [26/Nov/2025:09:49:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18707 "" "Go-http-client/1.1" Nov 26 04:49:57 localhost ceph-mon[288827]: Reconfiguring osd.3 (monmap changed)... 
Nov 26 04:49:57 localhost ceph-mon[288827]: Reconfiguring daemon osd.3 on np0005536119.localdomain Nov 26 04:49:57 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:57 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:57 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:49:58 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:49:58 localhost nova_compute[281415]: 2025-11-26 09:49:58.429 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:49:58 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536119.dxhchp (monmap changed)... Nov 26 04:49:58 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536119.dxhchp on np0005536119.localdomain Nov 26 04:49:58 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:58 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:58 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:49:59 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536119.eupicg (monmap changed)... 
Nov 26 04:49:59 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536119.eupicg on np0005536119.localdomain Nov 26 04:49:59 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:59 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:49:59 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:50:00 localhost ceph-mon[288827]: Reconfiguring mon.np0005536119 (monmap changed)... Nov 26 04:50:00 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536119 on np0005536119.localdomain Nov 26 04:50:00 localhost ceph-mon[288827]: overall HEALTH_OK Nov 26 04:50:00 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:00 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:01 localhost nova_compute[281415]: 2025-11-26 09:50:01.685 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:01 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:01 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:01 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:50:01 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:01 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:50:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:50:02 localhost systemd[1]: tmp-crun.vfuKNy.mount: Deactivated successfully. Nov 26 04:50:02 localhost podman[291539]: 2025-11-26 09:50:02.567999599 +0000 UTC m=+0.093640487 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd) Nov 26 04:50:02 localhost podman[291539]: 2025-11-26 09:50:02.579619309 +0000 UTC m=+0.105260167 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:50:02 localhost systemd[1]: 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:50:02 localhost podman[291570]: 2025-11-26 09:50:02.669223794 +0000 UTC m=+0.096356739 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:50:02 localhost podman[291570]: 2025-11-26 09:50:02.677471273 +0000 UTC m=+0.104604258 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Nov 26 04:50:02 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:50:02 localhost ceph-mon[288827]: Removing np0005536112.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:50:02 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:02 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:02 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:02 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:02 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:02 localhost ceph-mon[288827]: Removing np0005536112.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:50:02 localhost ceph-mon[288827]: Removing np0005536112.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:50:02 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:50:02 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:50:02 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:02 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:02 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:02 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:02 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:02 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' 
entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:02 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:02 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:03 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:50:03 localhost nova_compute[281415]: 2025-11-26 09:50:03.452 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.581 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.582 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.582 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in 
the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.589 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2c5b604-07dc-4ac1-8ed3-ecd42584781c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7111, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.582651', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '4813d79e-caad-11f0-9fe3-fa163e73ba36', 
'monotonic_time': 11078.824931398, 'message_signature': '46415f763bbe6f66fbddc061517189e0ea8a6859a3e2772292683687a3e36e0c'}]}, 'timestamp': '2025-11-26 09:50:03.590188', '_unique_id': '47c87fabbc82445b81b5ce2903e439e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.592 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.593 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1e738fd4-606c-487b-9d6e-cafd98b48b7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.593462', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '48146e84-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 'message_signature': 'eb628fb63c362040ab76c7336034bde7398846aa9fd5eca0c219efcce4dd63d7'}]}, 'timestamp': '2025-11-26 09:50:03.593925', '_unique_id': 'ef499daf668840d58bdf1e7641c2c3dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.594 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.595 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.626 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.627 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6db2840-cfa5-4177-9a24-fda37a886021', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.596094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4819954e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': 'f461ae5a1de70c4bdf607ecfd5711b2426e70e2f7b9d2e298c93de3d436f713f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:50:03.596094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4819ad0e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': '0d8c05639587a377d3e97c516e4a0d1041030f94e0f328992ab9f7fb6da6edea'}]}, 'timestamp': '2025-11-26 09:50:03.628300', '_unique_id': '4cffee3a244d4558af60a9c70b06bf29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.630 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.631 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.631 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.632 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '220a74b5-0d79-44d1-af1d-3ff96299eb3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.631617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '481a411a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': '3a7229a323c7a960294301046b98c8fdb417e607f639380e061e592eb3ee718e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:50:03.631617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '481a52f4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': 'bb5fb150f6066e0469ad0eb4d1893b39a5f231b2f8af1309b5ec068ad5c4fbfd'}]}, 'timestamp': '2025-11-26 09:50:03.632510', '_unique_id': '8f53d0aa444e465c8b0347f84ab43e13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.633 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.634 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.644 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.645 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '797faf2a-a5f2-476e-89ac-b2a72b1e9d20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.634733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '481c48c0-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.877047561, 'message_signature': 'ee479f1d5c5c3dbe617d452e7248262ce85ebb0fbb23ee74582a4ebbe2c2613a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:50:03.634733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '481c598c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.877047561, 'message_signature': 'a02aca34ccebbd9c1d44c2ad86166c38dacc934402ee805a2f8d673fe36a6372'}]}, 'timestamp': '2025-11-26 09:50:03.645785', '_unique_id': '06ae5ec292cd4f5cab26aa4d7ee3f454'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.646 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.647 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.648 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:50:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:50:03.655 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404 Nov 26 04:50:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:50:03.655 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409 Nov 26 04:50:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:50:03.656 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.665 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 12990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '3e1d67cc-c9a4-4bad-834a-b98fbbf7611b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12990000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:50:03.648194', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '481f6410-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.907406408, 'message_signature': '8ead8640087bc972980ec0d32adfc61438926b2ac97c79f679961ee6e58ee2c3'}]}, 'timestamp': '2025-11-26 09:50:03.665735', '_unique_id': '5470db72a3ab4bc5b8a8f0fc590082f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.666 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.667 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:50:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b395cb7-30ca-4d07-ba12-86600a6e52e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.668044', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '481fcfae-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': '5c931deb9e42cf2249f21374aa032a19894fdf829d1e3a7b777196162bfac2a8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:50:03.668044', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '481fdf94-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': 'b1078c959ed9d48d7aab015bc0ed8bdf40ef11a69169ea8e412fa727bda4b2d3'}]}, 'timestamp': '2025-11-26 09:50:03.668877', '_unique_id': '76f7c07b1e8e403c97efd7aa2244bf6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:50:03.669 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.669 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.670 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9b5dcac-b43a-4213-8da3-b46415d27b63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.671071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '482045ec-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': '122a3bcbf3e6eb80bbb02eaad71292b60b11d55943620e6ad28873ee1d751b8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:50:03.671071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '482055b4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': 'eeb8b762049e931d8a64953e0bb658c0667930fef6ca745b86dd7f098b913b6a'}]}, 'timestamp': '2025-11-26 09:50:03.671897', '_unique_id': '9ca50c705b4c4b488bf340ec0e26fe95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most 
recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging Nov 
26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 
04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.674 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29fafa5a-78b4-45ca-b703-de973bc81917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.674046', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 
'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '4820ba86-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 'message_signature': 'cdfd35a77692263129ae7a1e6f54fe7aa2cc30239597f225473bfc3805876078'}]}, 'timestamp': '2025-11-26 09:50:03.674509', '_unique_id': '93aa4e07268d4487be72685c116d9c85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.676 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.676 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '109cdd94-0f5a-4a7e-97fb-9e2cea3efcbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:50:03.676600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '48211db4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.907406408, 'message_signature': '9de572bf8f526e6cd76cee484df0d02a0a2a04c8446d4b7fde3aa3ecd2a43b1e'}]}, 'timestamp': '2025-11-26 09:50:03.677064', '_unique_id': 'd495361988fe4377a65572ddaf854f7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.679 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 04:50:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.679 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e80b4cfc-5d63-43f3-bb87-f7ed4818d527', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.679146', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '4821815a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 
'message_signature': 'd8291d35a9a3b199663b570347352e2b066c05535e59c76cb9f863b2f8f4dcc6'}]}, 'timestamp': '2025-11-26 09:50:03.679600', '_unique_id': 'bb75c1c32cd145ad9d2f4f36a349ce77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.681 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.681 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.682 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8312ca44-4a44-479c-81cd-5515c5b59905', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.681801', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4821e9e2-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': '8f4bbfad67a50c64529fb059767d23cff4ae2d59604e662c983bf91d871f3a29'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:50:03.681801', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4821f9e6-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': '331523feae510ac56f206885ec5b45a87d6df030f398ce7b3b8637ae99dbe2cf'}]}, 'timestamp': '2025-11-26 09:50:03.682657', '_unique_id': '67c9bc400df347b3a5b851a76ce9c35f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.684 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.685 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0472544f-ae8a-4b73-b450-1e5b7692cb7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.684976', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '4822653e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 'message_signature': 'c5f40471b807ffb1f546e6b1084130f738de6ace9be90411a1da21690c49cef6'}]}, 'timestamp': '2025-11-26 09:50:03.685434', '_unique_id': 'c768d4683fc748808a572298df7b7698'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 
04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.687 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.687 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '69f451b9-aca6-441d-aaf5-9065d3eeebee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.687495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4822c72c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': '4a6b7146df57d293c08bf3279dd58528c13494193ecbfef734961a9fb625f7e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:50:03.687495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4822d85c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.838357503, 'message_signature': '7e297e9976ee3fee190e38b4630c3b0f17379c06a80c4b4d6425e2c7c8d38615'}]}, 'timestamp': '2025-11-26 09:50:03.688352', '_unique_id': '188c89102f1a4c3dbfea414a1757c62e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.689 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.690 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1d71e07-2c8e-476e-b6da-bbc513741050', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.690461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48233b08-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.877047561, 'message_signature': 'c86747ac67dc5ba842bf2de4798ea696c90d8940f8d3d9a82960899e173b0dbf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': 
'2025-11-26T09:50:03.690461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48234d0a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.877047561, 'message_signature': '551c4b47b4916de5424db445cd2190fbbf1ded4a1f89d5e8fb73a3458514663e'}]}, 'timestamp': '2025-11-26 09:50:03.691341', '_unique_id': '0d8e7a4ef4894cba80c0957929fde7c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.693 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a0d41cfd-175a-45c3-a1d6-d7347b79f7ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.693424', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '4823af16-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 'message_signature': '54c69302920ffcd5a4c5e85e152ed894fdd5a3c50a7b01be2fe8713f198e0d31'}]}, 'timestamp': '2025-11-26 09:50:03.693875', '_unique_id': 'dbdad7ee7a7a48ce94f68893219a366a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.695 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.696 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8039f41-ad4a-4457-9384-f67d4b5e52d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.695967', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '4824124e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 'message_signature': 'bc559c517e9cd3ab03e95b017e309d46d4ca3fff8d2f19194022ef5a69a9007e'}]}, 'timestamp': '2025-11-26 09:50:03.696417', '_unique_id': '80a3b224066341dcb9e10ab11912395e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.697 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.698 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.698 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2dbbff87-a024-4529-862b-8f63b79f3b6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.698447', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '48247306-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 'message_signature': 'b6f3eb4e8cf15d014628ff9ad5ce9fade482faf0d41d9f06f5d8e8231d44193d'}]}, 'timestamp': '2025-11-26 09:50:03.698914', '_unique_id': 'fd25dde1699d43be801ee09b75b71b0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.699 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.701 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '877628bb-50ec-46ef-b285-ba40f1f57143', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.701259', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '4824e0d4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 'message_signature': 'a387ad4a19da661b8f89562aa73517d4b6c01d724a3284216f1c28cf4d4595d1'}]}, 'timestamp': '2025-11-26 09:50:03.701702', '_unique_id': '8e3821fc8a554d4ba02a79abc9e9c8fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.702 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.703 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4eec2a3-7a22-4a5c-acc9-38a259b31b59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:50:03.703686', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '48253caa-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.824931398, 'message_signature': '3e9f70aa9a04bee043e86a731a088827f437f98db3999288a77574e59b0ab366'}]}, 'timestamp': '2025-11-26 09:50:03.703989', '_unique_id': 'a2347be54c854aa39287e5990deabb0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:50:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 04:50:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.705 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.705 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.705 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a04b227-59ae-4866-88c9-a2d69e042b41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:50:03.705267', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48257a44-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.877047561, 'message_signature': '7f75246da4072d5efb57a9596920b8061eb60e726101139e0bc4ec310930554e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:50:03.705267', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '482583d6-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11078.877047561, 'message_signature': 'ccf3a02ced6612239098007412f4a8e90d3273d3f6dfcac971747f226823679b'}]}, 'timestamp': '2025-11-26 09:50:03.705772', '_unique_id': '911ce862adcd46f994cc56e95f624366'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR
oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:50:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:50:03.706 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:50:03 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:50:03 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:50:03 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:50:03 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:03 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:03 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:03 localhost ceph-mon[288827]: Removing daemon mgr.np0005536112.srlncr from np0005536112.localdomain -- ports [9283, 8765]
Nov 26 04:50:05 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:05 localhost ceph-mon[288827]: Added label _no_schedule to host np0005536112.localdomain
Nov 26 04:50:05 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:05 localhost ceph-mon[288827]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005536112.localdomain
Nov 26 04:50:05 localhost ceph-mon[288827]: Removing key for mgr.np0005536112.srlncr
Nov 26 04:50:05 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth rm", "entity": "mgr.np0005536112.srlncr"} : dispatch
Nov 26 04:50:05 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005536112.srlncr"}]': finished
Nov 26 04:50:05
localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:05 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:05 localhost sshd[291734]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:50:06 localhost nova_compute[281415]: 2025-11-26 09:50:06.689 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:50:07 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:08 localhost ceph-mon[288827]: Removing daemon crash.np0005536112 from np0005536112.localdomain -- ports []
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536112.localdomain"} : dispatch
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005536112.localdomain"}]': finished
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth rm", "entity": "client.crash.np0005536112.localdomain"} : dispatch
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005536112.localdomain"}]': finished
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:50:08 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:08 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:50:08 localhost
nova_compute[281415]: 2025-11-26 09:50:08.488 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:50:09 localhost ceph-mon[288827]: Removing key for client.crash.np0005536112.localdomain
Nov 26 04:50:09 localhost ceph-mon[288827]: Removed host np0005536112.localdomain
Nov 26 04:50:09 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536113.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:50:10 localhost ceph-mon[288827]: Reconfiguring crash.np0005536113 (monmap changed)...
Nov 26 04:50:10 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536113 on np0005536113.localdomain
Nov 26 04:50:10 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:10 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:10 localhost ceph-mon[288827]: Reconfiguring mon.np0005536113 (monmap changed)...
Nov 26 04:50:10 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:50:10 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536113 on np0005536113.localdomain
Nov 26 04:50:11 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:11 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:11 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536113.tjpmyn (monmap changed)...
Nov 26 04:50:11 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536113.tjpmyn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:50:11 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536113.tjpmyn on np0005536113.localdomain
Nov 26 04:50:11 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:11 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:11 localhost ceph-mon[288827]: Reconfiguring mon.np0005536114 (monmap changed)...
Nov 26 04:50:11 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:50:11 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536114 on np0005536114.localdomain
Nov 26 04:50:11 localhost nova_compute[281415]: 2025-11-26 09:50:11.693 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:50:12 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:12 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:12 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536114.ddbqmi (monmap changed)...
Nov 26 04:50:12 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:50:12 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536114.ddbqmi on np0005536114.localdomain
Nov 26 04:50:12 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:12 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:12 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:12 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536114.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:50:13 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:50:13 localhost nova_compute[281415]: 2025-11-26 09:50:13.491 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:50:13 localhost ceph-mon[288827]: Reconfiguring crash.np0005536114 (monmap changed)...
Nov 26 04:50:13 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536114 on np0005536114.localdomain
Nov 26 04:50:13 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:13 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:13 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:50:14 localhost ceph-mon[288827]: Reconfiguring crash.np0005536117 (monmap changed)...
Nov 26 04:50:14 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536117 on np0005536117.localdomain
Nov 26 04:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 04:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:50:15 localhost openstack_network_exporter[242153]: ERROR 09:50:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 26 04:50:15 localhost openstack_network_exporter[242153]: ERROR 09:50:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:50:15 localhost openstack_network_exporter[242153]: ERROR 09:50:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:50:15 localhost openstack_network_exporter[242153]: ERROR 09:50:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 26 04:50:15 localhost openstack_network_exporter[242153]:
Nov 26 04:50:15 localhost openstack_network_exporter[242153]: ERROR 09:50:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 26 04:50:15 localhost openstack_network_exporter[242153]:
Nov 26 04:50:15 localhost podman[291795]: 2025-11-26 09:50:15.861501443 +0000 UTC m=+0.115961230 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 26 04:50:15 localhost podman[291795]: 2025-11-26 09:50:15.877332662 +0000 UTC m=+0.131792479 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 26 04:50:15 localhost systemd[1]: tmp-crun.Qrg4yW.mount: Deactivated successfully.
Nov 26 04:50:15 localhost podman[291794]: 2025-11-26 09:50:15.915600167 +0000 UTC m=+0.169787626 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 26 04:50:15 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service:
Deactivated successfully. Nov 26 04:50:15 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:15 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:15 localhost ceph-mon[288827]: Reconfiguring osd.2 (monmap changed)... Nov 26 04:50:15 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 26 04:50:15 localhost ceph-mon[288827]: Reconfiguring daemon osd.2 on np0005536117.localdomain Nov 26 04:50:15 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:15 localhost podman[291794]: 2025-11-26 09:50:15.956485551 +0000 UTC m=+0.210673060 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:50:15 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.013975) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150616013999, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1771, "num_deletes": 253, "total_data_size": 2991542, "memory_usage": 3027928, "flush_reason": "Manual Compaction"} Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150616024332, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1702729, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11300, "largest_seqno": 13066, "table_properties": {"data_size": 1695164, "index_size": 4266, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 20233, "raw_average_key_size": 22, "raw_value_size": 1678413, "raw_average_value_size": 1873, "num_data_blocks": 184, "num_entries": 896, "num_filter_entries": 896, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", 
"prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150582, "oldest_key_time": 1764150582, "file_creation_time": 1764150616, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 10384 microseconds, and 2952 cpu microseconds. Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.024359) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1702729 bytes OK Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.024373) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.025913) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.025925) EVENT_LOG_v1 {"time_micros": 1764150616025921, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.025953) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 
0.25 Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 2982696, prev total WAL file size 2982696, number of live WAL files 2. Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.026542) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end) Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1662KB)], [15(10MB)] Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150616026608, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12257314, "oldest_snapshot_seqno": -1} Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9320 keys, 11072922 bytes, temperature: kUnknown Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150616074824, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 11072922, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11020853, "index_size": 27648, "index_partitions": 0, "top_level_index_size": 0, 
"index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23365, "raw_key_size": 249633, "raw_average_key_size": 26, "raw_value_size": 10862228, "raw_average_value_size": 1165, "num_data_blocks": 1046, "num_entries": 9320, "num_filter_entries": 9320, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150548, "oldest_key_time": 0, "file_creation_time": 1764150616, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.075150) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 11072922 bytes Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.077028) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 253.8 rd, 229.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 10.1 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(13.7) write-amplify(6.5) OK, records in: 9866, records dropped: 546 output_compression: NoCompression Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.077059) EVENT_LOG_v1 {"time_micros": 1764150616077045, "job": 6, "event": "compaction_finished", "compaction_time_micros": 48289, "compaction_time_cpu_micros": 23593, "output_level": 6, "num_output_files": 1, "total_output_size": 11072922, "num_input_records": 9866, "num_output_records": 9320, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150616077498, "job": 6, "event": "table_file_deletion", "file_number": 17} Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150616079870, "job": 6, 
"event": "table_file_deletion", "file_number": 15} Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.026451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.079913) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.079918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.079921) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.079924) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:50:16 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:50:16.079946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:50:16 localhost nova_compute[281415]: 2025-11-26 09:50:16.696 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:16 localhost ceph-mon[288827]: Saving service mon spec with placement label:mon Nov 26 04:50:16 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:16 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:16 localhost ceph-mon[288827]: Reconfiguring osd.5 (monmap changed)... 
Nov 26 04:50:16 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 26 04:50:16 localhost ceph-mon[288827]: Reconfiguring daemon osd.5 on np0005536117.localdomain Nov 26 04:50:17 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdaf20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0 Nov 26 04:50:17 localhost ceph-mon[288827]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:50:17 localhost ceph-mon[288827]: paxos.3).electionLogic(26) init, last seen epoch 26 Nov 26 04:50:17 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:18 localhost nova_compute[281415]: 2025-11-26 09:50:18.586 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:50:20 localhost systemd[1]: tmp-crun.dwbdEG.mount: Deactivated successfully. 
Nov 26 04:50:20 localhost podman[291839]: 2025-11-26 09:50:20.840823641 +0000 UTC m=+0.099272158 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller) Nov 26 04:50:20 localhost podman[291839]: 2025-11-26 09:50:20.919803984 +0000 UTC m=+0.178252491 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 04:50:20 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:50:21 localhost nova_compute[281415]: 2025-11-26 09:50:21.699 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:50:22 localhost ceph-mon[288827]: paxos.3).electionLogic(27) init, last seen epoch 27, mid-election, bumping Nov 26 04:50:22 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:22 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:22 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:22 localhost systemd[1]: tmp-crun.n0Ng1R.mount: Deactivated successfully. Nov 26 04:50:22 localhost podman[291864]: 2025-11-26 09:50:22.832153458 +0000 UTC m=+0.091696199 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal) Nov 26 04:50:22 localhost podman[291864]: 2025-11-26 09:50:22.847381197 +0000 UTC m=+0.106923928 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 26 04:50:22 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:22 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:50:23 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:50:23 localhost nova_compute[281415]: 2025-11-26 09:50:23.633 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:23 localhost ceph-mon[288827]: mon.np0005536113 calling monitor election Nov 26 04:50:23 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election Nov 26 04:50:23 localhost ceph-mon[288827]: mon.np0005536118 calling monitor election Nov 26 04:50:23 localhost ceph-mon[288827]: mon.np0005536113 calling monitor election Nov 26 04:50:23 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election Nov 26 04:50:23 localhost ceph-mon[288827]: Health check failed: 1/4 mons down, quorum np0005536114,np0005536113,np0005536119 (MON_DOWN) Nov 26 04:50:23 localhost ceph-mon[288827]: overall HEALTH_OK Nov 26 04:50:23 localhost ceph-mon[288827]: mon.np0005536114 calling monitor election Nov 26 04:50:23 localhost ceph-mon[288827]: mon.np0005536114 is new leader, mons np0005536114,np0005536113,np0005536119,np0005536118 in quorum (ranks 0,1,2,3) Nov 26 04:50:23 localhost ceph-mon[288827]: Health check cleared: MON_DOWN (was: 1/4 mons down, quorum np0005536114,np0005536113,np0005536119) Nov 26 04:50:23 localhost ceph-mon[288827]: Cluster is now healthy Nov 26 04:50:23 localhost ceph-mon[288827]: overall HEALTH_OK Nov 26 04:50:23 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:23 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:23 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536117.ggibwg (monmap changed)... 
Nov 26 04:50:23 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:50:23 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536117.ggibwg on np0005536117.localdomain Nov 26 04:50:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:50:24 localhost podman[291935]: 2025-11-26 09:50:24.995773995 +0000 UTC m=+0.080327116 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:50:25 localhost podman[291935]: 2025-11-26 09:50:25.010262522 +0000 UTC m=+0.094815613 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:50:25 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:50:25 localhost podman[291943]:
Nov 26 04:50:25 localhost podman[291943]: 2025-11-26 09:50:25.08110461 +0000 UTC m=+0.138333976 container create a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_heisenberg, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-09-24T08:57:55, ceph=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=)
Nov 26 04:50:25 localhost podman[291943]: 2025-11-26 09:50:24.991658361 +0000 UTC m=+0.048887777 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:50:25 localhost systemd[1]: Started libpod-conmon-a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b.scope.
Nov 26 04:50:25 localhost systemd[1]: Started libcrun container.
Nov 26 04:50:25 localhost podman[291943]: 2025-11-26 09:50:25.162618951 +0000 UTC m=+0.219848317 container init a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_heisenberg, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, distribution-scope=public, RELEASE=main, release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux , vcs-type=git, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.) 
Nov 26 04:50:25 localhost podman[291943]: 2025-11-26 09:50:25.174687875 +0000 UTC m=+0.231917251 container start a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_heisenberg, ceph=True, architecture=x86_64, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=553, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, version=7)
Nov 26 04:50:25 localhost podman[291943]: 2025-11-26 09:50:25.175083207 +0000 UTC m=+0.232312603 container attach a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_heisenberg, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-type=git)
Nov 26 04:50:25 localhost amazing_heisenberg[291974]: 167 167
Nov 26 04:50:25 localhost podman[291943]: 2025-11-26 09:50:25.179492581 +0000 UTC m=+0.236721947 container died a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_heisenberg, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, release=553, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 26 04:50:25 localhost systemd[1]: libpod-a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b.scope: Deactivated successfully.
Nov 26 04:50:25 localhost podman[291979]: 2025-11-26 09:50:25.269187948 +0000 UTC m=+0.076144620 container remove a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_heisenberg, ceph=True, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, version=7, RELEASE=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=)
Nov 26 04:50:25 localhost systemd[1]: libpod-conmon-a9f099020562e4d3ef369e65a83d30e9a4fa2c290adac6872ce19951f946340b.scope: Deactivated successfully.
Nov 26 04:50:25 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:25 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:25 localhost ceph-mon[288827]: Reconfiguring crash.np0005536118 (monmap changed)...
Nov 26 04:50:25 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:50:25 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536118 on np0005536118.localdomain
Nov 26 04:50:25 localhost systemd[1]: var-lib-containers-storage-overlay-692b05b77d9414290bb0e67c06e9ed5822b6d4e2ce1f516c3dccb67022f81015-merged.mount: Deactivated successfully.
Nov 26 04:50:26 localhost podman[292049]:
Nov 26 04:50:26 localhost podman[292049]: 2025-11-26 09:50:26.056306336 +0000 UTC m=+0.086773550 container create c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_wiles, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=553, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=)
Nov 26 04:50:26 localhost systemd[1]: Started libpod-conmon-c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e.scope.
Nov 26 04:50:26 localhost podman[292049]: 2025-11-26 09:50:26.018057611 +0000 UTC m=+0.048524865 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:50:26 localhost systemd[1]: Started libcrun container.
Nov 26 04:50:26 localhost podman[292049]: 2025-11-26 09:50:26.141020083 +0000 UTC m=+0.171487297 container init c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_wiles, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2025-09-24T08:57:55)
Nov 26 04:50:26 localhost podman[292049]: 2025-11-26 09:50:26.151138168 +0000 UTC m=+0.181605392 container start c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_wiles, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 26 04:50:26 localhost podman[292049]: 2025-11-26 09:50:26.15151379 +0000 UTC m=+0.181981044 container attach c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_wiles, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, release=553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Nov 26 04:50:26 localhost elegant_wiles[292065]: 167 167
Nov 26 04:50:26 localhost systemd[1]: libpod-c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e.scope: Deactivated successfully.
Nov 26 04:50:26 localhost podman[292049]: 2025-11-26 09:50:26.155325345 +0000 UTC m=+0.185792569 container died c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_wiles, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True)
Nov 26 04:50:26 localhost podman[292070]: 2025-11-26 09:50:26.254193189 +0000 UTC m=+0.085847392 container remove c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_wiles, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, name=rhceph, io.buildah.version=1.33.12)
Nov 26 04:50:26 localhost systemd[1]: libpod-conmon-c329e530e4b2c7f28b49962c0f8aa51bcf5fb13fa8cb351da5f033f68a68261e.scope: Deactivated successfully.
Nov 26 04:50:26 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:26 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:26 localhost ceph-mon[288827]: Reconfiguring osd.0 (monmap changed)...
Nov 26 04:50:26 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 26 04:50:26 localhost ceph-mon[288827]: Reconfiguring daemon osd.0 on np0005536118.localdomain
Nov 26 04:50:26 localhost nova_compute[281415]: 2025-11-26 09:50:26.703 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:50:26 localhost systemd[1]: var-lib-containers-storage-overlay-2089a32bda801e6b2dd95fe715fa0874947c85bd0fc7e6a69c31786ef79ada0d-merged.mount: Deactivated successfully.
Nov 26 04:50:27 localhost podman[292149]:
Nov 26 04:50:27 localhost podman[292149]: 2025-11-26 09:50:27.203093711 +0000 UTC m=+0.090795391 container create 541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_haslett, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, release=553, CEPH_POINT_RELEASE=, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 26 04:50:27 localhost systemd[1]: Started libpod-conmon-541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49.scope.
Nov 26 04:50:27 localhost podman[292149]: 2025-11-26 09:50:27.164518676 +0000 UTC m=+0.052220386 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:50:27 localhost systemd[1]: Started libcrun container.
Nov 26 04:50:27 localhost podman[292149]: 2025-11-26 09:50:27.28388115 +0000 UTC m=+0.171582830 container init 541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_haslett, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=553, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, build-date=2025-09-24T08:57:55, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 26 04:50:27 localhost podman[292149]: 2025-11-26 09:50:27.298174611 +0000 UTC m=+0.185876291 container start 541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_haslett, io.openshift.expose-services=, version=7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, ceph=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-type=git)
Nov 26 04:50:27 localhost podman[292149]: 2025-11-26 09:50:27.298752348 +0000 UTC m=+0.186454028 container attach 541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_haslett, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , release=553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main)
Nov 26 04:50:27 localhost hardcore_haslett[292164]: 167 167
Nov 26 04:50:27 localhost systemd[1]: libpod-541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49.scope: Deactivated successfully.
Nov 26 04:50:27 localhost podman[292149]: 2025-11-26 09:50:27.303742799 +0000 UTC m=+0.191444899 container died 541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_haslett, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7)
Nov 26 04:50:27 localhost podman[292169]: 2025-11-26 09:50:27.404581613 +0000 UTC m=+0.089870094 container remove 541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_haslett, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, RELEASE=main, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 26 04:50:27 localhost systemd[1]: libpod-conmon-541c40208874bd0656fa318f7fa810048c5f37b1151555171c14b45435426d49.scope: Deactivated successfully.
Nov 26 04:50:27 localhost podman[240049]: time="2025-11-26T09:50:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 26 04:50:27 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:27 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:27 localhost ceph-mon[288827]: Reconfiguring osd.4 (monmap changed)...
Nov 26 04:50:27 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 26 04:50:27 localhost ceph-mon[288827]: Reconfiguring daemon osd.4 on np0005536118.localdomain
Nov 26 04:50:27 localhost podman[240049]: @ - - [26/Nov/2025:09:50:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1"
Nov 26 04:50:27 localhost podman[240049]: @ - - [26/Nov/2025:09:50:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18714 "" "Go-http-client/1.1"
Nov 26 04:50:27 localhost systemd[1]: var-lib-containers-storage-overlay-61bb1e498b0f73537e7b06d538598b1bc4c0c897c0738c4bbea94e8d93e9a23c-merged.mount: Deactivated successfully.
Nov 26 04:50:28 localhost podman[292245]:
Nov 26 04:50:28 localhost podman[292245]: 2025-11-26 09:50:28.327790248 +0000 UTC m=+0.080671995 container create 59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_bardeen, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 26 04:50:28 localhost systemd[1]: Started libpod-conmon-59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8.scope.
Nov 26 04:50:28 localhost systemd[1]: Started libcrun container.
Nov 26 04:50:28 localhost podman[292245]: 2025-11-26 09:50:28.294517815 +0000 UTC m=+0.047399582 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:50:28 localhost podman[292245]: 2025-11-26 09:50:28.394473981 +0000 UTC m=+0.147355728 container init 59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_bardeen, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, ceph=True, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12)
Nov 26 04:50:28 localhost podman[292245]: 2025-11-26 09:50:28.40668397 +0000 UTC m=+0.159565707 container start 59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_bardeen, io.openshift.expose-services=, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=)
Nov 26 04:50:28 localhost podman[292245]: 2025-11-26 09:50:28.407035971 +0000 UTC m=+0.159917738 container attach 59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_bardeen, io.buildah.version=1.33.12, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., version=7, distribution-scope=public, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=)
Nov 26 04:50:28 localhost exciting_bardeen[292260]: 167 167
Nov 26 04:50:28 localhost systemd[1]: libpod-59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8.scope: Deactivated successfully.
Nov 26 04:50:28 localhost podman[292245]: 2025-11-26 09:50:28.411350631 +0000 UTC m=+0.164232378 container died 59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_bardeen, RELEASE=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux )
Nov 26 04:50:28 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:50:28 localhost podman[292265]: 2025-11-26 09:50:28.518418763 +0000 UTC m=+0.097985159 container remove 59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_bardeen, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, build-date=2025-09-24T08:57:55)
Nov 26 04:50:28 localhost systemd[1]: libpod-conmon-59589ac9bb184c611a8205875672a703772f71a82aa5d9c6290a9f6e42c298d8.scope: Deactivated successfully.
Nov 26 04:50:28 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:28 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi'
Nov 26 04:50:28 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)...
Nov 26 04:50:28 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:50:28 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain Nov 26 04:50:28 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:28 localhost nova_compute[281415]: 2025-11-26 09:50:28.675 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:28 localhost systemd[1]: var-lib-containers-storage-overlay-06eaa49cd8d6f44835a52b1edd4019d9d51dbf3edceaafd38a73ae927387b0cd-merged.mount: Deactivated successfully. Nov 26 04:50:29 localhost podman[292334]: Nov 26 04:50:29 localhost podman[292334]: 2025-11-26 09:50:29.33599656 +0000 UTC m=+0.080555502 container create 7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_shtern, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=553, build-date=2025-09-24T08:57:55, 
RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container) Nov 26 04:50:29 localhost systemd[1]: Started libpod-conmon-7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c.scope. Nov 26 04:50:29 localhost systemd[1]: Started libcrun container. Nov 26 04:50:29 localhost podman[292334]: 2025-11-26 09:50:29.30254581 +0000 UTC m=+0.047104762 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:50:29 localhost podman[292334]: 2025-11-26 09:50:29.403485927 +0000 UTC m=+0.148044859 container init 7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_shtern, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , architecture=x86_64, CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:50:29 localhost podman[292334]: 2025-11-26 09:50:29.413878781 +0000 UTC m=+0.158437723 container start 
7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_shtern, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, RELEASE=main, ceph=True, release=553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc.) 
Nov 26 04:50:29 localhost podman[292334]: 2025-11-26 09:50:29.41418887 +0000 UTC m=+0.158747852 container attach 7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_shtern, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 04:50:29 localhost upbeat_shtern[292349]: 167 167 Nov 26 04:50:29 localhost systemd[1]: libpod-7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c.scope: Deactivated successfully. 
Nov 26 04:50:29 localhost podman[292334]: 2025-11-26 09:50:29.419086999 +0000 UTC m=+0.163645951 container died 7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_shtern, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.buildah.version=1.33.12) Nov 26 04:50:29 localhost podman[292354]: 2025-11-26 09:50:29.513141378 +0000 UTC m=+0.086839092 container remove 7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_shtern, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, 
com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_CLEAN=True, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container) Nov 26 04:50:29 localhost systemd[1]: libpod-conmon-7925fad985ce88239e5479eb57dfc66c36bd7b11163d3b1b92acd30e470ce23c.scope: Deactivated successfully. Nov 26 04:50:29 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:29 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536118.anceyj (monmap changed)... Nov 26 04:50:29 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:50:29 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:50:29 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:29 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:29 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:50:29 localhost systemd[1]: 
var-lib-containers-storage-overlay-e47fd6453c0586556ff0ce357327ccaf4e0f5afc4a525e3c6c25bda0b93db4c5-merged.mount: Deactivated successfully. Nov 26 04:50:30 localhost ceph-mon[288827]: Reconfiguring crash.np0005536119 (monmap changed)... Nov 26 04:50:30 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain Nov 26 04:50:30 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:30 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:30 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 26 04:50:31 localhost ceph-mon[288827]: Reconfiguring osd.1 (monmap changed)... Nov 26 04:50:31 localhost ceph-mon[288827]: Reconfiguring daemon osd.1 on np0005536119.localdomain Nov 26 04:50:31 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:31 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:31 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 26 04:50:31 localhost nova_compute[281415]: 2025-11-26 09:50:31.707 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:50:32 localhost ceph-mon[288827]: Reconfiguring osd.3 (monmap changed)... 
Nov 26 04:50:32 localhost ceph-mon[288827]: Reconfiguring daemon osd.3 on np0005536119.localdomain Nov 26 04:50:32 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:32 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:32 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:50:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:50:32 localhost podman[292371]: 2025-11-26 09:50:32.843045598 +0000 UTC m=+0.092670348 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 26 04:50:32 localhost podman[292371]: 2025-11-26 09:50:32.876336533 +0000 UTC m=+0.125961313 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 26 04:50:32 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:50:32 localhost podman[292372]: 2025-11-26 09:50:32.8974301 +0000 UTC m=+0.148717020 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Nov 26 04:50:32 localhost podman[292372]: 2025-11-26 09:50:32.934566711 +0000 UTC m=+0.185853641 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3) Nov 26 04:50:32 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:50:33 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:50:33 localhost nova_compute[281415]: 2025-11-26 09:50:33.705 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:33 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536119.dxhchp (monmap changed)... Nov 26 04:50:33 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536119.dxhchp on np0005536119.localdomain Nov 26 04:50:33 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:33 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:33 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:50:34 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536119.eupicg (monmap changed)... 
Nov 26 04:50:34 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536119.eupicg on np0005536119.localdomain Nov 26 04:50:34 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:34 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:35 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:35 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:50:35 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:35 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:36 localhost nova_compute[281415]: 2025-11-26 09:50:36.710 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:36 localhost ceph-mon[288827]: Deploying daemon mon.np0005536117 on np0005536117.localdomain Nov 26 04:50:36 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:50:37 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Nov 26 04:50:37 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Nov 26 04:50:37 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:37 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:37 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 
04:50:37 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:37 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:37 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:38 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Nov 26 04:50:38 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdb600 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0 Nov 26 04:50:38 localhost ceph-mon[288827]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:50:38 localhost ceph-mon[288827]: paxos.3).electionLogic(32) init, last seen epoch 32 Nov 26 04:50:38 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:38 localhost ceph-mon[288827]: 
mon.np0005536118@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:38 localhost nova_compute[281415]: 2025-11-26 09:50:38.755 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:41 localhost nova_compute[281415]: 2025-11-26 09:50:41.714 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:43 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:43 localhost ceph-mon[288827]: mon.np0005536114 calling monitor election Nov 26 04:50:43 localhost ceph-mon[288827]: mon.np0005536113 calling monitor election Nov 26 04:50:43 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election Nov 26 04:50:43 localhost ceph-mon[288827]: mon.np0005536118 calling monitor election Nov 26 04:50:43 localhost ceph-mon[288827]: mon.np0005536114 is new leader, mons np0005536114,np0005536113,np0005536119,np0005536118 in quorum (ranks 0,1,2,3) Nov 26 04:50:43 localhost ceph-mon[288827]: Health check failed: 1/5 mons down, quorum np0005536114,np0005536113,np0005536119,np0005536118 (MON_DOWN) Nov 26 04:50:43 localhost ceph-mon[288827]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005536114,np0005536113,np0005536119,np0005536118 Nov 26 04:50:43 localhost ceph-mon[288827]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005536114,np0005536113,np0005536119,np0005536118 Nov 26 04:50:43 localhost ceph-mon[288827]: mon.np0005536117 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum) Nov 26 04:50:43 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:43 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' 
entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:43 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:50:43 localhost nova_compute[281415]: 2025-11-26 09:50:43.756 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:44 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:44 localhost ceph-mon[288827]: Reconfiguring crash.np0005536113 (monmap changed)... Nov 26 04:50:44 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536113.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:50:44 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536113 on np0005536113.localdomain Nov 26 04:50:44 localhost sshd[292747]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:50:45 localhost ceph-mon[288827]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:50:45 localhost ceph-mon[288827]: paxos.3).electionLogic(34) init, last seen epoch 34 Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536118@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 
26 04:50:45 localhost ceph-mon[288827]: mon.np0005536117 calling monitor election Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536113 calling monitor election Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536114 calling monitor election Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536118 calling monitor election Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536114 is new leader, mons np0005536114,np0005536113,np0005536119,np0005536118,np0005536117 in quorum (ranks 0,1,2,3,4) Nov 26 04:50:45 localhost ceph-mon[288827]: mon.np0005536117 calling monitor election Nov 26 04:50:45 localhost ceph-mon[288827]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005536114,np0005536113,np0005536119,np0005536118) Nov 26 04:50:45 localhost ceph-mon[288827]: Cluster is now healthy Nov 26 04:50:45 localhost ceph-mon[288827]: overall HEALTH_OK Nov 26 04:50:45 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:45 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:45 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:50:45 localhost openstack_network_exporter[242153]: ERROR 09:50:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:50:45 localhost openstack_network_exporter[242153]: ERROR 09:50:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:50:45 localhost openstack_network_exporter[242153]: ERROR 09:50:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for 
ovn-northd Nov 26 04:50:45 localhost openstack_network_exporter[242153]: ERROR 09:50:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:50:45 localhost openstack_network_exporter[242153]: Nov 26 04:50:45 localhost openstack_network_exporter[242153]: ERROR 09:50:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:50:45 localhost openstack_network_exporter[242153]: Nov 26 04:50:46 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536114.ddbqmi (monmap changed)... Nov 26 04:50:46 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536114.ddbqmi on np0005536114.localdomain Nov 26 04:50:46 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:46 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:46 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536114.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:50:46 localhost nova_compute[281415]: 2025-11-26 09:50:46.716 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:50:46 localhost podman[292750]: 2025-11-26 09:50:46.8497438 +0000 UTC m=+0.098447388 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:50:46 localhost podman[292749]: 2025-11-26 09:50:46.884594163 +0000 UTC m=+0.133608921 container health_status 
b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:50:46 localhost podman[292750]: 2025-11-26 09:50:46.893017062 +0000 UTC m=+0.141720640 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 04:50:46 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:50:46 localhost podman[292749]: 2025-11-26 09:50:46.921305872 +0000 UTC m=+0.170320660 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:50:46 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:50:47 localhost ceph-mon[288827]: Reconfiguring crash.np0005536114 (monmap changed)... Nov 26 04:50:47 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536114 on np0005536114.localdomain Nov 26 04:50:47 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:47 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:47 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:50:47 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:48 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:50:48 localhost nova_compute[281415]: 2025-11-26 09:50:48.790 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:48 localhost ceph-mon[288827]: Reconfiguring crash.np0005536117 (monmap changed)... 
Nov 26 04:50:48 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536117 on np0005536117.localdomain Nov 26 04:50:48 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:48 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:48 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 26 04:50:49 localhost nova_compute[281415]: 2025-11-26 09:50:49.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:49 localhost ceph-mon[288827]: Reconfiguring osd.2 (monmap changed)... Nov 26 04:50:49 localhost ceph-mon[288827]: Reconfiguring daemon osd.2 on np0005536117.localdomain Nov 26 04:50:49 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:49 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:49 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 26 04:50:50 localhost ceph-mon[288827]: Reconfiguring osd.5 (monmap changed)... 
Nov 26 04:50:50 localhost ceph-mon[288827]: Reconfiguring daemon osd.5 on np0005536117.localdomain Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:50 localhost ceph-mon[288827]: 
from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:51 localhost nova_compute[281415]: 2025-11-26 09:50:51.720 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:50:51 localhost podman[292788]: 2025-11-26 09:50:51.823711812 +0000 UTC m=+0.078937240 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:50:51 localhost nova_compute[281415]: 2025-11-26 09:50:51.844 281419 DEBUG 
oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:51 localhost nova_compute[281415]: 2025-11-26 09:50:51.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:51 localhost nova_compute[281415]: 2025-11-26 09:50:51.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:51 localhost nova_compute[281415]: 2025-11-26 09:50:51.847 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:50:51 localhost podman[292788]: 2025-11-26 09:50:51.894256692 +0000 UTC m=+0.149482090 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:50:51 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:50:52 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536117.tfthzg (monmap changed)... 
Nov 26 04:50:52 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536117.tfthzg on np0005536117.localdomain Nov 26 04:50:52 localhost ceph-mon[288827]: Reconfig service osd.default_drive_group Nov 26 04:50:52 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:52 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' Nov 26 04:50:52 localhost ceph-mon[288827]: from='mgr.14184 172.18.0.105:0/3749454852' entity='mgr.np0005536114.ddbqmi' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:50:52 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e9 handle_command mon_command({"prefix": "mgr fail"} v 0) Nov 26 04:50:52 localhost ceph-mon[288827]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3711428409' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:50:52 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e83 e83: 6 total, 6 up, 6 in Nov 26 04:50:52 localhost systemd[1]: session-65.scope: Deactivated successfully. Nov 26 04:50:52 localhost systemd[1]: session-65.scope: Consumed 19.573s CPU time. Nov 26 04:50:52 localhost systemd-logind[761]: Session 65 logged out. Waiting for processes to exit. Nov 26 04:50:52 localhost systemd-logind[761]: Removed session 65. Nov 26 04:50:52 localhost sshd[292814]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:50:52 localhost systemd-logind[761]: New session 66 of user ceph-admin. Nov 26 04:50:52 localhost systemd[1]: Started Session 66 of User ceph-admin. Nov 26 04:50:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:50:53 localhost podman[292836]: 2025-11-26 09:50:53.077119604 +0000 UTC m=+0.086449790 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7) Nov 26 04:50:53 localhost podman[292836]: 2025-11-26 09:50:53.093102956 +0000 UTC m=+0.102433092 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container) Nov 26 04:50:53 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:50:53 localhost ceph-mon[288827]: from='client.? 
' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: Activating manager daemon np0005536119.eupicg Nov 26 04:50:53 localhost ceph-mon[288827]: from='client.? 172.18.0.200:0/3711428409' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 26 04:50:53 localhost ceph-mon[288827]: Manager daemon np0005536119.eupicg is now available Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536112.localdomain.devices.0"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: removing stray HostCache host record np0005536112.localdomain.devices.0 Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536112.localdomain.devices.0"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005536112.localdomain.devices.0"}]': finished Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536112.localdomain.devices.0"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536112.localdomain.devices.0"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005536112.localdomain.devices.0"}]': finished Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config 
rm","who":"mgr","name":"mgr/rbd_support/np0005536119.eupicg/mirror_snapshot_schedule"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536119.eupicg/mirror_snapshot_schedule"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536119.eupicg/trash_purge_schedule"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536119.eupicg/trash_purge_schedule"} : dispatch Nov 26 04:50:53 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.836 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - 
- - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.872 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.873 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.873 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.873 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:50:53 localhost nova_compute[281415]: 2025-11-26 09:50:53.873 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:50:53 localhost podman[292943]: 2025-11-26 09:50:53.951191159 +0000 UTC m=+0.098452229 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:50:54 localhost podman[292943]: 2025-11-26 09:50:54.299460251 +0000 UTC m=+0.446721331 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, RELEASE=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, 
GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=553, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12) Nov 26 04:50:54 localhost ceph-mon[288827]: [26/Nov/2025:09:50:53] ENGINE Bus STARTING Nov 26 04:50:54 localhost nova_compute[281415]: 2025-11-26 09:50:54.626 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.752s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:50:54 localhost nova_compute[281415]: 2025-11-26 09:50:54.709 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:50:54 localhost nova_compute[281415]: 2025-11-26 09:50:54.710 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:50:54 localhost nova_compute[281415]: 2025-11-26 09:50:54.946 281419 WARNING nova.virt.libvirt.driver [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:50:54 localhost nova_compute[281415]: 2025-11-26 09:50:54.948 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11769MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:50:54 localhost nova_compute[281415]: 2025-11-26 09:50:54.949 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:50:54 localhost nova_compute[281415]: 2025-11-26 09:50:54.949 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.028 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.028 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.029 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.069 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:50:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:50:55 localhost podman[293104]: 2025-11-26 09:50:55.323127228 +0000 UTC m=+0.113041458 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:50:55 localhost podman[293104]: 2025-11-26 09:50:55.336561962 +0000 UTC m=+0.126476222 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:50:55 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:50:55 localhost ceph-mon[288827]: [26/Nov/2025:09:50:53] ENGINE Serving on https://172.18.0.108:7150 Nov 26 04:50:55 localhost ceph-mon[288827]: [26/Nov/2025:09:50:53] ENGINE Client ('172.18.0.108', 58290) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 26 04:50:55 localhost ceph-mon[288827]: [26/Nov/2025:09:50:53] ENGINE Serving on http://172.18.0.108:8765 Nov 26 04:50:55 localhost ceph-mon[288827]: [26/Nov/2025:09:50:53] ENGINE Bus STARTED Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:55 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:50:55 localhost ceph-mon[288827]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2832927965' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.536 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.545 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.568 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.571 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:50:55 localhost nova_compute[281415]: 2025-11-26 09:50:55.571 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd/host:np0005536113", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 
172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd/host:np0005536113", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd/host:np0005536114", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd/host:np0005536114", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 04:50:56 localhost nova_compute[281415]: 2025-11-26 09:50:56.571 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:56 localhost nova_compute[281415]: 2025-11-26 09:50:56.750 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:56 localhost 
nova_compute[281415]: 2025-11-26 09:50:56.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:50:56 localhost nova_compute[281415]: 2025-11-26 09:50:56.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:50:56 localhost nova_compute[281415]: 2025-11-26 09:50:56.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:50:57 localhost nova_compute[281415]: 2025-11-26 09:50:57.232 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:50:57 localhost nova_compute[281415]: 2025-11-26 09:50:57.233 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:50:57 localhost nova_compute[281415]: 2025-11-26 09:50:57.233 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:50:57 localhost nova_compute[281415]: 2025-11-26 09:50:57.233 281419 DEBUG 
nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:50:57 localhost podman[240049]: time="2025-11-26T09:50:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:50:57 localhost podman[240049]: @ - - [26/Nov/2025:09:50:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:50:57 localhost podman[240049]: @ - - [26/Nov/2025:09:50:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18718 "" "Go-http-client/1.1" Nov 26 04:50:57 localhost nova_compute[281415]: 2025-11-26 09:50:57.594 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": 
null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:50:57 localhost nova_compute[281415]: 2025-11-26 09:50:57.618 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:50:57 localhost nova_compute[281415]: 2025-11-26 09:50:57.619 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:50:57 localhost ceph-mon[288827]: Adjusting osd_memory_target on np0005536117.localdomain to 836.6M Nov 26 04:50:57 localhost ceph-mon[288827]: Adjusting osd_memory_target on np0005536119.localdomain to 836.6M Nov 26 04:50:57 localhost ceph-mon[288827]: Unable to set osd_memory_target on np0005536119.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:50:57 localhost ceph-mon[288827]: Unable to set osd_memory_target on np0005536117.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:50:57 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:57 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:57 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 26 04:50:57 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": 
"config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 26 04:50:57 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 26 04:50:57 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 26 04:50:57 localhost ceph-mon[288827]: Adjusting osd_memory_target on np0005536118.localdomain to 836.6M Nov 26 04:50:57 localhost ceph-mon[288827]: Unable to set osd_memory_target on np0005536118.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:50:57 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:50:57 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:57 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:57 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:57 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:57 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:50:58 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:50:58 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:50:58 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:50:58 localhost ceph-mon[288827]: Updating 
np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:50:58 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:50:58 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:50:58 localhost nova_compute[281415]: 2025-11-26 09:50:58.872 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:50:59 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:50:59 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:50:59 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:50:59 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:50:59 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' 
entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:50:59 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:00 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:51:00 localhost ceph-mon[288827]: Updating np0005536113.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:51:00 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:51:00 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:51:00 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:51:00 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 26 04:51:00 localhost ceph-mon[288827]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Nov 26 04:51:00 localhost ceph-mon[288827]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Nov 26 04:51:01 localhost ceph-mon[288827]: Reconfiguring daemon osd.2 on np0005536117.localdomain Nov 26 04:51:01 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:01 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:01 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:01 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:01 localhost 
ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 26 04:51:01 localhost nova_compute[281415]: 2025-11-26 09:51:01.793 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:02 localhost ceph-mon[288827]: Reconfiguring daemon osd.5 on np0005536117.localdomain Nov 26 04:51:02 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:02 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:02 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:02 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:02 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:02 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:02 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:51:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:51:03 localhost podman[293925]: 2025-11-26 09:51:03.290102749 +0000 UTC m=+0.094463866 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 26 04:51:03 localhost podman[293925]: 2025-11-26 09:51:03.298668633 +0000 UTC m=+0.103029780 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 26 04:51:03 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:51:03 localhost podman[293926]: 2025-11-26 09:51:03.355306495 +0000 UTC m=+0.152001196 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Nov 26 04:51:03 localhost podman[293926]: 2025-11-26 09:51:03.372560225 +0000 UTC m=+0.169254896 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 26 04:51:03 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully.
Nov 26 04:51:03 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:51:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:51:03.655 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 26 04:51:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:51:03.656 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 26 04:51:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:51:03.657 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 26 04:51:03 localhost podman[293997]:
Nov 26 04:51:03 localhost podman[293997]: 2025-11-26 09:51:03.760812798 +0000 UTC m=+0.083214881 container create aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_gates, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, release=553, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 26 04:51:03 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536117.ggibwg (monmap changed)...
Nov 26 04:51:03 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536117.ggibwg on np0005536117.localdomain
Nov 26 04:51:03 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:03 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:03 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:51:03 localhost ceph-mon[288827]: Reconfiguring crash.np0005536118 (monmap changed)...
Nov 26 04:51:03 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:51:03 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536118 on np0005536118.localdomain
Nov 26 04:51:03 localhost systemd[1]: Started libpod-conmon-aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df.scope.
Nov 26 04:51:03 localhost systemd[1]: Started libcrun container.
Nov 26 04:51:03 localhost podman[293997]: 2025-11-26 09:51:03.728531134 +0000 UTC m=+0.050933267 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:51:03 localhost podman[293997]: 2025-11-26 09:51:03.843557332 +0000 UTC m=+0.165959455 container init aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_gates, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.expose-services=, release=553, io.buildah.version=1.33.12, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.)
Nov 26 04:51:03 localhost podman[293997]: 2025-11-26 09:51:03.858284926 +0000 UTC m=+0.180687029 container start aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_gates, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, name=rhceph, GIT_CLEAN=True, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 26 04:51:03 localhost podman[293997]: 2025-11-26 09:51:03.858662448 +0000 UTC m=+0.181064561 container attach aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_gates, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 26 04:51:03 localhost kind_gates[294013]: 167 167
Nov 26 04:51:03 localhost systemd[1]: libpod-aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df.scope: Deactivated successfully.
Nov 26 04:51:03 localhost podman[293997]: 2025-11-26 09:51:03.86427522 +0000 UTC m=+0.186677333 container died aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_gates, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, RELEASE=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Nov 26 04:51:03 localhost nova_compute[281415]: 2025-11-26 09:51:03.907 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:03 localhost podman[294018]: 2025-11-26 09:51:03.970683213 +0000 UTC m=+0.096717696 container remove aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_gates, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553)
Nov 26 04:51:03 localhost systemd[1]: libpod-conmon-aed936c8919e942c4b5c9b66d19a2829aa9cc5f20bb671f25a431ed0b8ee30df.scope: Deactivated successfully.
Nov 26 04:51:04 localhost systemd[1]: var-lib-containers-storage-overlay-2b23ca0d9ca00d7c897a373307f6326bca1cb15e6defc33902afef1b42fa6438-merged.mount: Deactivated successfully.
Nov 26 04:51:04 localhost podman[294091]:
Nov 26 04:51:04 localhost podman[294091]: 2025-11-26 09:51:04.753272204 +0000 UTC m=+0.064465644 container create ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_satoshi, io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_BRANCH=main, version=7)
Nov 26 04:51:04 localhost systemd[1]: Started libpod-conmon-ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d.scope.
Nov 26 04:51:04 localhost podman[294091]: 2025-11-26 09:51:04.723429406 +0000 UTC m=+0.034622886 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:51:04 localhost systemd[1]: Started libcrun container.
Nov 26 04:51:04 localhost podman[294091]: 2025-11-26 09:51:04.846264504 +0000 UTC m=+0.157457974 container init ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_satoshi, build-date=2025-09-24T08:57:55, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container)
Nov 26 04:51:04 localhost podman[294091]: 2025-11-26 09:51:04.856453648 +0000 UTC m=+0.167647118 container start ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_satoshi, architecture=x86_64, release=553, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.buildah.version=1.33.12, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc.)
Nov 26 04:51:04 localhost podman[294091]: 2025-11-26 09:51:04.856755017 +0000 UTC m=+0.167948547 container attach ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_satoshi, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7)
Nov 26 04:51:04 localhost gallant_satoshi[294106]: 167 167
Nov 26 04:51:04 localhost systemd[1]: libpod-ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d.scope: Deactivated successfully.
Nov 26 04:51:04 localhost podman[294091]: 2025-11-26 09:51:04.865138395 +0000 UTC m=+0.176331935 container died ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_satoshi, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main)
Nov 26 04:51:04 localhost podman[294111]: 2025-11-26 09:51:04.956677971 +0000 UTC m=+0.081714075 container remove ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_satoshi, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, version=7, release=553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main)
Nov 26 04:51:04 localhost systemd[1]: libpod-conmon-ff832de30df3e39585dd1c6ab3ef7a8d270ca5cc36f42575e11750b86787564d.scope: Deactivated successfully.
Nov 26 04:51:05 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:05 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:05 localhost ceph-mon[288827]: Reconfiguring osd.0 (monmap changed)...
Nov 26 04:51:05 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 26 04:51:05 localhost ceph-mon[288827]: Reconfiguring daemon osd.0 on np0005536118.localdomain
Nov 26 04:51:05 localhost systemd[1]: var-lib-containers-storage-overlay-cb62927bfb3998f4df501bdbcf7137fc021547923811dc251f7791c3ee992044-merged.mount: Deactivated successfully.
Nov 26 04:51:05 localhost podman[294187]:
Nov 26 04:51:05 localhost podman[294187]: 2025-11-26 09:51:05.866615189 +0000 UTC m=+0.086700158 container create cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_turing, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.expose-services=, vcs-type=git)
Nov 26 04:51:05 localhost systemd[1]: Started libpod-conmon-cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b.scope.
Nov 26 04:51:05 localhost systemd[1]: Started libcrun container.
Nov 26 04:51:05 localhost podman[294187]: 2025-11-26 09:51:05.927552873 +0000 UTC m=+0.147637852 container init cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_turing, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, version=7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 26 04:51:05 localhost podman[294187]: 2025-11-26 09:51:05.832663754 +0000 UTC m=+0.052748773 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:51:05 localhost reverent_turing[294202]: 167 167
Nov 26 04:51:05 localhost podman[294187]: 2025-11-26 09:51:05.93917843 +0000 UTC m=+0.159263409 container start cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_turing, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main)
Nov 26 04:51:05 localhost podman[294187]: 2025-11-26 09:51:05.940688437 +0000 UTC m=+0.160773466 container attach cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_turing, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, RELEASE=main, release=553, version=7, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True)
Nov 26 04:51:05 localhost systemd[1]: libpod-cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b.scope: Deactivated successfully.
Nov 26 04:51:05 localhost podman[294187]: 2025-11-26 09:51:05.945370411 +0000 UTC m=+0.165455400 container died cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_turing, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, distribution-scope=public, vcs-type=git, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 26 04:51:06 localhost podman[294208]: 2025-11-26 09:51:06.053844007 +0000 UTC m=+0.095816378 container remove cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_turing, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., version=7, release=553, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux )
Nov 26 04:51:06 localhost systemd[1]: libpod-conmon-cd214147927eac289c4694a9eec8158db749c97a2f1ef99c6af1fc485815768b.scope: Deactivated successfully.
Nov 26 04:51:06 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:06 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:06 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:06 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:06 localhost ceph-mon[288827]: Reconfiguring osd.4 (monmap changed)...
Nov 26 04:51:06 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 26 04:51:06 localhost ceph-mon[288827]: Reconfiguring daemon osd.4 on np0005536118.localdomain Nov 26 04:51:06 localhost systemd[1]: var-lib-containers-storage-overlay-b2f24754cc82abcb18856471116ba05b93dcfa20b7693c248c293f7ea7361679-merged.mount: Deactivated successfully. Nov 26 04:51:06 localhost nova_compute[281415]: 2025-11-26 09:51:06.831 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:06 localhost podman[294284]: Nov 26 04:51:06 localhost podman[294284]: 2025-11-26 09:51:06.98637447 +0000 UTC m=+0.088671237 container create 7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_mendel, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux ) Nov 26 04:51:07 
localhost systemd[1]: Started libpod-conmon-7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b.scope. Nov 26 04:51:07 localhost systemd[1]: Started libcrun container. Nov 26 04:51:07 localhost podman[294284]: 2025-11-26 09:51:06.950653771 +0000 UTC m=+0.052950588 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:07 localhost podman[294284]: 2025-11-26 09:51:07.061430759 +0000 UTC m=+0.163727546 container init 7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_mendel, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 04:51:07 localhost tender_mendel[294299]: 167 167 Nov 26 04:51:07 localhost systemd[1]: libpod-7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b.scope: Deactivated successfully. 
Nov 26 04:51:07 localhost podman[294284]: 2025-11-26 09:51:07.083194379 +0000 UTC m=+0.185491156 container start 7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_mendel, version=7, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, build-date=2025-09-24T08:57:55, RELEASE=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-type=git) Nov 26 04:51:07 localhost podman[294284]: 2025-11-26 09:51:07.08355125 +0000 UTC m=+0.185848077 container attach 7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_mendel, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 04:51:07 localhost podman[294284]: 2025-11-26 09:51:07.086417518 +0000 UTC m=+0.188714315 container died 7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_mendel, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, ceph=True, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, release=553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) 
Nov 26 04:51:07 localhost podman[294304]: 2025-11-26 09:51:07.190065886 +0000 UTC m=+0.103419122 container remove 7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_mendel, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=) Nov 26 04:51:07 localhost systemd[1]: libpod-conmon-7112416f4ece0188b8fa0f2518d3b085ed50e4d037d363532d1b6f78121cc61b.scope: Deactivated successfully. 
Nov 26 04:51:07 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:07 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:07 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:07 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:07 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:51:07 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)... Nov 26 04:51:07 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:51:07 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain Nov 26 04:51:07 localhost ceph-mon[288827]: Saving service mon spec with placement label:mon Nov 26 04:51:07 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:07 localhost systemd[1]: var-lib-containers-storage-overlay-4a9d32098dd427c5ca98348f25b3aa67f487bcbf1aa3ecc4fb8b94b2e61f7635-merged.mount: Deactivated successfully. 
Nov 26 04:51:07 localhost podman[294373]: Nov 26 04:51:07 localhost podman[294373]: 2025-11-26 09:51:07.965862898 +0000 UTC m=+0.079223687 container create 6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_wilbur, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:51:08 localhost systemd[1]: Started libpod-conmon-6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e.scope. Nov 26 04:51:08 localhost systemd[1]: Started libcrun container. 
Nov 26 04:51:08 localhost podman[294373]: 2025-11-26 09:51:07.936099682 +0000 UTC m=+0.049460471 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:08 localhost podman[294373]: 2025-11-26 09:51:08.036286164 +0000 UTC m=+0.149646983 container init 6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_wilbur, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, release=553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:51:08 localhost podman[294373]: 2025-11-26 09:51:08.045897739 +0000 UTC m=+0.159258518 container start 6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_wilbur, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, 
com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12) Nov 26 04:51:08 localhost podman[294373]: 2025-11-26 09:51:08.04622623 +0000 UTC m=+0.159587049 container attach 6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_wilbur, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, architecture=x86_64, vendor=Red Hat, Inc., version=7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main) 
Nov 26 04:51:08 localhost adoring_wilbur[294389]: 167 167 Nov 26 04:51:08 localhost systemd[1]: libpod-6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e.scope: Deactivated successfully. Nov 26 04:51:08 localhost podman[294373]: 2025-11-26 09:51:08.051166271 +0000 UTC m=+0.164527080 container died 6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_wilbur, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, RELEASE=main, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, version=7, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 04:51:08 localhost podman[294394]: 2025-11-26 09:51:08.171671378 +0000 UTC m=+0.102258226 container remove 6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_wilbur, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-09-24T08:57:55, release=553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, 
distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, vendor=Red Hat, Inc., version=7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:51:08 localhost systemd[1]: libpod-conmon-6c0e74f02775bfe1c5405672ee77930431d1f0c45a1acc7432685e7d6811e65e.scope: Deactivated successfully. Nov 26 04:51:08 localhost systemd[1]: var-lib-containers-storage-overlay-c54e0ee60a68d1aba0a5443f26ca82f1c414ca1a9973adfd55ef27bc64ac3142-merged.mount: Deactivated successfully. Nov 26 04:51:08 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:08 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:08 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:08 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536118.anceyj (monmap changed)... 
Nov 26 04:51:08 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:08 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:51:08 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:08 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:08 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:51:08 localhost nova_compute[281415]: 2025-11-26 09:51:08.958 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:09 localhost podman[294463]: Nov 26 04:51:09 localhost podman[294463]: 2025-11-26 09:51:09.018504076 +0000 UTC m=+0.114053219 container create 8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_poitras, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, 
maintainer=Guillaume Abrioux , GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True) Nov 26 04:51:09 localhost systemd[1]: Started libpod-conmon-8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6.scope. Nov 26 04:51:09 localhost systemd[1]: Started libcrun container. Nov 26 04:51:09 localhost podman[294463]: 2025-11-26 09:51:08.985862852 +0000 UTC m=+0.081412015 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:09 localhost podman[294463]: 2025-11-26 09:51:09.087804047 +0000 UTC m=+0.183353190 container init 8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_poitras, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, release=553, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True) Nov 26 04:51:09 localhost podman[294463]: 2025-11-26 09:51:09.09796962 +0000 UTC 
m=+0.193518773 container start 8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_poitras, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, architecture=x86_64, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 04:51:09 localhost keen_poitras[294478]: 167 167 Nov 26 04:51:09 localhost podman[294463]: 2025-11-26 09:51:09.100184948 +0000 UTC m=+0.195734091 container attach 8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_poitras, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, version=7, vendor=Red Hat, Inc., name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=553, GIT_CLEAN=True, io.buildah.version=1.33.12) Nov 26 04:51:09 localhost systemd[1]: libpod-8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6.scope: Deactivated successfully. 
Nov 26 04:51:09 localhost podman[294463]: 2025-11-26 09:51:09.103397137 +0000 UTC m=+0.198946330 container died 8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_poitras, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=) Nov 26 04:51:09 localhost podman[294483]: 2025-11-26 09:51:09.21405876 +0000 UTC m=+0.097561531 container remove 8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_poitras, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public) Nov 26 04:51:09 localhost systemd[1]: libpod-conmon-8b726f86765f3ad2dce30da0bb49a6c3b490758cd5b4f0b37417241ce75b6cd6.scope: Deactivated successfully. Nov 26 04:51:09 localhost systemd[1]: var-lib-containers-storage-overlay-21805d09126236729db74616ec5638204d7e829ff1056487a84459be844ec059-merged.mount: Deactivated successfully. Nov 26 04:51:09 localhost ceph-mon[288827]: Reconfiguring mon.np0005536118 (monmap changed)... Nov 26 04:51:09 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:51:09 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536118 on np0005536118.localdomain Nov 26 04:51:09 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:10 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:10 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:51:10 localhost ceph-mon[288827]: Reconfiguring crash.np0005536119 (monmap changed)... 
Nov 26 04:51:10 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:51:10 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain
Nov 26 04:51:10 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:10 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:10 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 26 04:51:11 localhost ceph-mon[288827]: Reconfiguring osd.1 (monmap changed)...
Nov 26 04:51:11 localhost ceph-mon[288827]: Reconfiguring daemon osd.1 on np0005536119.localdomain
Nov 26 04:51:11 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:11 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:11 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:11 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:11 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 26 04:51:11 localhost nova_compute[281415]: 2025-11-26 09:51:11.835 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:12 localhost ceph-mon[288827]: Reconfiguring osd.3 (monmap changed)...
Nov 26 04:51:12 localhost ceph-mon[288827]: Reconfiguring daemon osd.3 on np0005536119.localdomain
Nov 26 04:51:12 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:12 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:12 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:12 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:12 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:51:12 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:12.960831) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150672960963, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2672, "num_deletes": 256, "total_data_size": 8369392, "memory_usage": 8661072, "flush_reason": "Manual Compaction"}
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150672989631, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 4643818, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13071, "largest_seqno": 15738, "table_properties": {"data_size": 4633505, "index_size": 6242, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26729, "raw_average_key_size": 22, "raw_value_size": 4610845, "raw_average_value_size": 3891, "num_data_blocks": 261, "num_entries": 1185, "num_filter_entries": 1185, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150616, "oldest_key_time": 1764150616, "file_creation_time": 1764150672, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 28875 microseconds, and 13386 cpu microseconds.
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:12.989722) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 4643818 bytes OK
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:12.989757) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:12.992363) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:12.992385) EVENT_LOG_v1 {"time_micros": 1764150672992379, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:12.992412) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8356348, prev total WAL file size 8356348, number of live WAL files 2.
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:12.994255) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303132' seq:72057594037927935, type:22 .. '6B760031323636' seq:0, type:0; will stop at (end)
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4534KB)], [18(10MB)]
Nov 26 04:51:12 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150672994349, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 15716740, "oldest_snapshot_seqno": -1}
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10043 keys, 14973939 bytes, temperature: kUnknown
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150673073746, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14973939, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14915530, "index_size": 32196, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25157, "raw_key_size": 268322, "raw_average_key_size": 26, "raw_value_size": 14742664, "raw_average_value_size": 1467, "num_data_blocks": 1223, "num_entries": 10043, "num_filter_entries": 10043, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150548, "oldest_key_time": 0, "file_creation_time": 1764150672, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:13.074412) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14973939 bytes
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:13.076566) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 197.2 rd, 187.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.4, 10.6 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(6.6) write-amplify(3.2) OK, records in: 10505, records dropped: 462 output_compression: NoCompression
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:13.076612) EVENT_LOG_v1 {"time_micros": 1764150673076590, "job": 8, "event": "compaction_finished", "compaction_time_micros": 79694, "compaction_time_cpu_micros": 48741, "output_level": 6, "num_output_files": 1, "total_output_size": 14973939, "num_input_records": 10505, "num_output_records": 10043, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150673077706, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150673080028, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:12.994076) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:13.080183) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:13.080195) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:13.080199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:13.080201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:51:13 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:13.080204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:51:13 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536119.dxhchp (monmap changed)...
Nov 26 04:51:13 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536119.dxhchp on np0005536119.localdomain
Nov 26 04:51:13 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:13 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:13 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:51:13 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:51:13 localhost ceph-mon[288827]: mon.np0005536118@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:51:13 localhost nova_compute[281415]: 2025-11-26 09:51:13.988 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:14 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536119.eupicg (monmap changed)...
Nov 26 04:51:14 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536119.eupicg on np0005536119.localdomain
Nov 26 04:51:14 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:14 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:14 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:51:15 localhost ceph-mon[288827]: Reconfiguring mon.np0005536119 (monmap changed)...
Nov 26 04:51:15 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536119 on np0005536119.localdomain
Nov 26 04:51:15 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:15 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:15 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:51:15 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:15 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:15 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:51:15 localhost openstack_network_exporter[242153]: ERROR 09:51:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 26 04:51:15 localhost openstack_network_exporter[242153]: ERROR 09:51:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:51:15 localhost openstack_network_exporter[242153]: ERROR 09:51:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:51:15 localhost openstack_network_exporter[242153]: ERROR 09:51:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 26 04:51:15 localhost openstack_network_exporter[242153]:
Nov 26 04:51:15 localhost openstack_network_exporter[242153]: ERROR 09:51:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 26 04:51:15 localhost openstack_network_exporter[242153]:
Nov 26 04:51:16 localhost ceph-mon[288827]: Reconfiguring mon.np0005536113 (monmap changed)...
Nov 26 04:51:16 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536113 on np0005536113.localdomain
Nov 26 04:51:16 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:16 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:16 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:51:16 localhost nova_compute[281415]: 2025-11-26 09:51:16.865 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:17 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdaf20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 26 04:51:17 localhost ceph-mon[288827]: mon.np0005536118@3(peon) e10 my rank is now 2 (was 3)
Nov 26 04:51:17 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Nov 26 04:51:17 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Nov 26 04:51:17 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcc98f8000 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Nov 26 04:51:17 localhost ceph-mon[288827]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election
Nov 26 04:51:17 localhost ceph-mon[288827]: paxos.2).electionLogic(36) init, last seen epoch 36
Nov 26 04:51:17 localhost ceph-mon[288827]: mon.np0005536118@2(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:51:17 localhost ceph-mon[288827]: mon.np0005536118@2(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 04:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:51:17 localhost podman[294520]: 2025-11-26 09:51:17.844312392 +0000 UTC m=+0.095245110 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 26 04:51:17 localhost podman[294519]: 2025-11-26 09:51:17.892079502 +0000 UTC m=+0.143702091 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 26 04:51:17 localhost podman[294520]: 2025-11-26 09:51:17.913632775 +0000 UTC m=+0.164565523 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 04:51:17 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully.
Nov 26 04:51:17 localhost podman[294519]: 2025-11-26 09:51:17.929426751 +0000 UTC m=+0.181049340 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 26 04:51:17 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully.
Nov 26 04:51:19 localhost nova_compute[281415]: 2025-11-26 09:51:19.004 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:19 localhost ceph-mon[288827]: mon.np0005536118@2(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:51:19 localhost ceph-mon[288827]: mon.np0005536118@2(peon) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.186516) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150679186835, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 460, "num_deletes": 251, "total_data_size": 422860, "memory_usage": 432360, "flush_reason": "Manual Compaction"}
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150679191130, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 259822, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15743, "largest_seqno": 16198, "table_properties": {"data_size": 257143, "index_size": 726, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7503, "raw_average_key_size": 21, "raw_value_size": 251377, "raw_average_value_size": 706, "num_data_blocks": 30, "num_entries": 356, "num_filter_entries": 356, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150673, "oldest_key_time": 1764150673, "file_creation_time": 1764150679, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 4398 microseconds, and 1442 cpu microseconds.
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.191177) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 259822 bytes OK
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.191198) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.193417) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.193438) EVENT_LOG_v1 {"time_micros": 1764150679193432, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.193460) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 419941, prev total WAL file size 419941, number of live WAL files 2.
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.194455) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(253KB)], [21(14MB)]
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150679194503, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15233761, "oldest_snapshot_seqno": -1}
Nov 26 04:51:19 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536114 on np0005536114.localdomain
Nov 26 04:51:19 localhost ceph-mon[288827]: Remove daemons mon.np0005536113
Nov 26 04:51:19 localhost ceph-mon[288827]: Safe to remove mon.np0005536113: new quorum should be ['np0005536114', 'np0005536119', 'np0005536118', 'np0005536117'] (from ['np0005536114', 'np0005536119', 'np0005536118', 'np0005536117'])
Nov 26 04:51:19 localhost ceph-mon[288827]: Removing monitor np0005536113 from monmap...
Nov 26 04:51:19 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "mon rm", "name": "np0005536113"} : dispatch
Nov 26 04:51:19 localhost ceph-mon[288827]: Removing daemon mon.np0005536113 from np0005536113.localdomain -- ports []
Nov 26 04:51:19 localhost ceph-mon[288827]: mon.np0005536119 calling monitor election
Nov 26 04:51:19 localhost ceph-mon[288827]: mon.np0005536118 calling monitor election
Nov 26 04:51:19 localhost ceph-mon[288827]: mon.np0005536117 calling monitor election
Nov 26 04:51:19 localhost ceph-mon[288827]: mon.np0005536114 calling monitor election
Nov 26 04:51:19 localhost ceph-mon[288827]: mon.np0005536114 is new leader, mons np0005536114,np0005536119,np0005536118,np0005536117 in quorum (ranks 0,1,2,3)
Nov 26 04:51:19 localhost ceph-mon[288827]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Nov 26 04:51:19 localhost ceph-mon[288827]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Nov 26 04:51:19 localhost ceph-mon[288827]: stray daemon mgr.np0005536112.srlncr on host np0005536112.localdomain not managed by cephadm
Nov 26 04:51:19 localhost ceph-mon[288827]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Nov 26 04:51:19 localhost ceph-mon[288827]: stray host np0005536112.localdomain has 1 stray daemons: ['mgr.np0005536112.srlncr']
Nov 26 04:51:19 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:19 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 9870 keys, 13176645 bytes, temperature: kUnknown
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150679251842, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13176645, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13121013, "index_size": 29867, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24709, "raw_key_size": 265303, "raw_average_key_size": 26, "raw_value_size": 12952769, "raw_average_value_size": 1312, "num_data_blocks": 1123, "num_entries": 9870, "num_filter_entries": 9870, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150548, "oldest_key_time": 0, "file_creation_time": 1764150679, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "96260784-ab37-4bfa-a747-b54286a1d4f8", "db_session_id": "ZOF5ONGIRCTUGR7KNLS5", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.252151) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13176645 bytes Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.254212) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 265.1 rd, 229.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 14.3 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(109.3) write-amplify(50.7) OK, records in: 10399, records dropped: 529 output_compression: NoCompression Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.254234) EVENT_LOG_v1 {"time_micros": 1764150679254224, "job": 10, "event": "compaction_finished", "compaction_time_micros": 57469, "compaction_time_cpu_micros": 27364, "output_level": 6, "num_output_files": 1, "total_output_size": 13176645, "num_input_records": 10399, "num_output_records": 9870, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150679254393, "job": 10, "event": "table_file_deletion", "file_number": 23} Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150679255903, 
"job": 10, "event": "table_file_deletion", "file_number": 21} Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.194338) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.256091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.256101) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.256104) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.256108) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:51:19 localhost ceph-mon[288827]: rocksdb: (Original Log Time 2025/11/26-09:51:19.256111) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:51:20 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:20 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536114.ddbqmi (monmap changed)... 
Nov 26 04:51:20 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:20 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:20 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536114.ddbqmi on np0005536114.localdomain Nov 26 04:51:20 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:21 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:21 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536114.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:51:21 localhost ceph-mon[288827]: Reconfiguring crash.np0005536114 (monmap changed)... 
Nov 26 04:51:21 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536114.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:51:21 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536114 on np0005536114.localdomain Nov 26 04:51:21 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:21 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:21 localhost sshd[294563]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:51:21 localhost sshd[294564]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:51:21 localhost nova_compute[281415]: 2025-11-26 09:51:21.904 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:22 localhost ceph-mon[288827]: Removed label mon from host np0005536113.localdomain Nov 26 04:51:22 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:22 localhost ceph-mon[288827]: Reconfiguring crash.np0005536117 (monmap changed)... 
Nov 26 04:51:22 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:51:22 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:51:22 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536117 on np0005536117.localdomain Nov 26 04:51:22 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:22 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:22 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 04:51:22 localhost podman[294566]: 2025-11-26 09:51:22.852401413 +0000 UTC m=+0.098675266 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 04:51:22 localhost podman[294566]: 2025-11-26 09:51:22.897610283 +0000 UTC m=+0.143884136 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118) Nov 26 04:51:22 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:51:23 localhost sshd[294591]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:51:23 localhost ceph-mon[288827]: Removed label mgr from host np0005536113.localdomain Nov 26 04:51:23 localhost ceph-mon[288827]: Reconfiguring osd.2 (monmap changed)... Nov 26 04:51:23 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 26 04:51:23 localhost ceph-mon[288827]: Reconfiguring daemon osd.2 on np0005536117.localdomain Nov 26 04:51:23 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:51:23 localhost ceph-mon[288827]: mon.np0005536118@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:51:23 localhost systemd[1]: tmp-crun.fNqqt1.mount: Deactivated successfully. Nov 26 04:51:23 localhost podman[294593]: 2025-11-26 09:51:23.461773146 +0000 UTC m=+0.093654291 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 26 04:51:23 localhost podman[294593]: 2025-11-26 09:51:23.504515761 +0000 UTC m=+0.136396906 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible) Nov 26 04:51:23 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:51:24 localhost nova_compute[281415]: 2025-11-26 09:51:24.050 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:24 localhost ceph-mon[288827]: Removed label _admin from host np0005536113.localdomain Nov 26 04:51:24 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:24 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:24 localhost ceph-mon[288827]: Reconfiguring osd.5 (monmap changed)... Nov 26 04:51:24 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 26 04:51:24 localhost ceph-mon[288827]: Reconfiguring daemon osd.5 on np0005536117.localdomain Nov 26 04:51:25 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:25 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:25 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:51:25 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536117.tfthzg (monmap changed)... 
Nov 26 04:51:25 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:51:25 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536117.tfthzg on np0005536117.localdomain Nov 26 04:51:25 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:25 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:25 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:25 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:51:25 localhost podman[294611]: 2025-11-26 09:51:25.818442284 +0000 UTC m=+0.078756534 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:51:25 localhost podman[294611]: 2025-11-26 09:51:25.832418784 +0000 UTC m=+0.092733044 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:51:25 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:51:26 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536117.ggibwg (monmap changed)... 
Nov 26 04:51:26 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536117.ggibwg on np0005536117.localdomain Nov 26 04:51:26 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:26 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:26 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:51:26 localhost nova_compute[281415]: 2025-11-26 09:51:26.945 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:27 localhost ceph-mon[288827]: Reconfiguring mon.np0005536117 (monmap changed)... Nov 26 04:51:27 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536117 on np0005536117.localdomain Nov 26 04:51:27 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:27 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:27 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:51:27 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:51:27 localhost podman[240049]: time="2025-11-26T09:51:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:51:27 localhost podman[240049]: @ - - [26/Nov/2025:09:51:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" 
Nov 26 04:51:27 localhost podman[240049]: @ - - [26/Nov/2025:09:51:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18720 "" "Go-http-client/1.1" Nov 26 04:51:27 localhost podman[294686]: Nov 26 04:51:27 localhost podman[294686]: 2025-11-26 09:51:27.77553914 +0000 UTC m=+0.077327878 container create f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_leakey, io.openshift.expose-services=, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, release=553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , architecture=x86_64) Nov 26 04:51:27 localhost systemd[1]: Started libpod-conmon-f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5.scope. Nov 26 04:51:27 localhost systemd[1]: Started libcrun container. 
Nov 26 04:51:27 localhost podman[294686]: 2025-11-26 09:51:27.744504916 +0000 UTC m=+0.046293684 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:27 localhost podman[294686]: 2025-11-26 09:51:27.855658475 +0000 UTC m=+0.157447193 container init f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_leakey, version=7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public) Nov 26 04:51:27 localhost podman[294686]: 2025-11-26 09:51:27.868308924 +0000 UTC m=+0.170097652 container start f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_leakey, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in 
a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55) Nov 26 04:51:27 localhost podman[294686]: 2025-11-26 09:51:27.868733197 +0000 UTC m=+0.170521945 container attach f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_leakey, version=7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7) Nov 26 04:51:27 
localhost optimistic_leakey[294701]: 167 167 Nov 26 04:51:27 localhost systemd[1]: libpod-f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5.scope: Deactivated successfully. Nov 26 04:51:27 localhost podman[294686]: 2025-11-26 09:51:27.876478175 +0000 UTC m=+0.178266973 container died f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_leakey, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, release=553, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public) Nov 26 04:51:27 localhost podman[294706]: 2025-11-26 09:51:27.993452683 +0000 UTC m=+0.104172585 container remove f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_leakey, io.openshift.expose-services=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, ceph=True, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git) Nov 26 04:51:28 localhost systemd[1]: libpod-conmon-f5dbfac7ea29736331b2c88e308a92500de9db2fb69859ee7ea61e7ec7ba2cc5.scope: Deactivated successfully. Nov 26 04:51:28 localhost ceph-mon[288827]: mon.np0005536118@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:51:28 localhost ceph-mon[288827]: Reconfiguring crash.np0005536118 (monmap changed)... 
Nov 26 04:51:28 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536118 on np0005536118.localdomain Nov 26 04:51:28 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:28 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:28 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 26 04:51:28 localhost podman[294776]: Nov 26 04:51:28 localhost podman[294776]: 2025-11-26 09:51:28.718143714 +0000 UTC m=+0.056240541 container create 2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_cannon, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, release=553, distribution-scope=public, GIT_BRANCH=main, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, maintainer=Guillaume Abrioux ) Nov 26 04:51:28 localhost systemd[1]: Started libpod-conmon-2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc.scope. Nov 26 04:51:28 localhost systemd[1]: Started libcrun container. 
Nov 26 04:51:28 localhost podman[294776]: 2025-11-26 09:51:28.765015535 +0000 UTC m=+0.103112362 container init 2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_cannon, version=7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-type=git, release=553, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main) Nov 26 04:51:28 localhost podman[294776]: 2025-11-26 09:51:28.773057912 +0000 UTC m=+0.111154729 container start 2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_cannon, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, architecture=x86_64, 
GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=553) Nov 26 04:51:28 localhost podman[294776]: 2025-11-26 09:51:28.773282559 +0000 UTC m=+0.111379396 container attach 2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_cannon, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, release=553, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=) Nov 26 04:51:28 localhost focused_cannon[294790]: 167 167 Nov 26 04:51:28 localhost systemd[1]: libpod-2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc.scope: 
Deactivated successfully. Nov 26 04:51:28 localhost podman[294776]: 2025-11-26 09:51:28.775716634 +0000 UTC m=+0.113813471 container died 2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_cannon, maintainer=Guillaume Abrioux , RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, version=7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=553) Nov 26 04:51:28 localhost systemd[1]: var-lib-containers-storage-overlay-d5d1e64f9a8d912782b068b74f62f1dcda714bd0ad2956ac4b1bd16e0b6f7782-merged.mount: Deactivated successfully. Nov 26 04:51:28 localhost podman[294776]: 2025-11-26 09:51:28.696596561 +0000 UTC m=+0.034693428 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:28 localhost systemd[1]: var-lib-containers-storage-overlay-930e1f7fcce13f0165fb516ca57188e74e18594174519f7409583ff9b66c3d52-merged.mount: Deactivated successfully. 
Nov 26 04:51:28 localhost podman[294795]: 2025-11-26 09:51:28.854704794 +0000 UTC m=+0.068432096 container remove 2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_cannon, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, architecture=x86_64, build-date=2025-09-24T08:57:55, ceph=True, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 04:51:28 localhost systemd[1]: libpod-conmon-2150ccd154ed359349394d33a765c0998bd867f2f519ac11ca99e2affc0804bc.scope: Deactivated successfully. Nov 26 04:51:29 localhost nova_compute[281415]: 2025-11-26 09:51:29.051 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:29 localhost ceph-mon[288827]: Reconfiguring osd.0 (monmap changed)... 
Nov 26 04:51:29 localhost ceph-mon[288827]: Reconfiguring daemon osd.0 on np0005536118.localdomain Nov 26 04:51:29 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:29 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:29 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 26 04:51:29 localhost podman[294872]: Nov 26 04:51:29 localhost podman[294872]: 2025-11-26 09:51:29.68881583 +0000 UTC m=+0.076578486 container create 6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_shannon, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, ceph=True, release=553, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:51:29 localhost systemd[1]: Started libpod-conmon-6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73.scope. Nov 26 04:51:29 localhost systemd[1]: Started libcrun container. 
Nov 26 04:51:29 localhost podman[294872]: 2025-11-26 09:51:29.657055063 +0000 UTC m=+0.044817709 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:29 localhost podman[294872]: 2025-11-26 09:51:29.772298308 +0000 UTC m=+0.160060964 container init 6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_shannon, vendor=Red Hat, Inc., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, name=rhceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:51:29 localhost podman[294872]: 2025-11-26 09:51:29.7808222 +0000 UTC m=+0.168584856 container start 6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_shannon, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_BRANCH=main) Nov 26 04:51:29 localhost loving_shannon[294888]: 167 167 Nov 26 04:51:29 localhost podman[294872]: 2025-11-26 09:51:29.78245394 +0000 UTC m=+0.170216656 container attach 6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_shannon, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., 
RELEASE=main, maintainer=Guillaume Abrioux ) Nov 26 04:51:29 localhost systemd[1]: libpod-6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73.scope: Deactivated successfully. Nov 26 04:51:29 localhost podman[294872]: 2025-11-26 09:51:29.7847262 +0000 UTC m=+0.172488856 container died 6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_shannon, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_BRANCH=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph) Nov 26 04:51:29 localhost systemd[1]: var-lib-containers-storage-overlay-eb5e9b4f099c618a54bf519b33e4bc5e3c45915453a9ffe5ace891fe8d48939f-merged.mount: Deactivated successfully. 
Nov 26 04:51:29 localhost podman[294893]: 2025-11-26 09:51:29.889673778 +0000 UTC m=+0.090279138 container remove 6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_shannon, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, release=553, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=7) Nov 26 04:51:29 localhost systemd[1]: libpod-conmon-6a8b39e1b7168d8908ba18454fe3cc775706beec7fca85d181c5bff59af81a73.scope: Deactivated successfully. Nov 26 04:51:30 localhost ceph-mon[288827]: Reconfiguring osd.4 (monmap changed)... 
Nov 26 04:51:30 localhost ceph-mon[288827]: Reconfiguring daemon osd.4 on np0005536118.localdomain Nov 26 04:51:30 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:30 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:30 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:51:30 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:51:30 localhost podman[294971]: Nov 26 04:51:30 localhost podman[294971]: 2025-11-26 09:51:30.752092885 +0000 UTC m=+0.081854179 container create 05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_mendel, vendor=Red Hat, Inc., version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, release=553, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:51:30 localhost systemd[1]: Started libpod-conmon-05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010.scope. Nov 26 04:51:30 localhost podman[294971]: 2025-11-26 09:51:30.717047496 +0000 UTC m=+0.046808790 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:30 localhost systemd[1]: Started libcrun container. Nov 26 04:51:30 localhost podman[294971]: 2025-11-26 09:51:30.851453221 +0000 UTC m=+0.181214485 container init 05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_mendel, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.buildah.version=1.33.12, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True) Nov 26 04:51:30 localhost podman[294971]: 2025-11-26 09:51:30.863083418 +0000 UTC m=+0.192844682 container start 
05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_mendel, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, version=7, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_BRANCH=main) Nov 26 04:51:30 localhost podman[294971]: 2025-11-26 09:51:30.863445229 +0000 UTC m=+0.193206493 container attach 05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_mendel, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, io.buildah.version=1.33.12, 
GIT_BRANCH=main, release=553, io.openshift.expose-services=, ceph=True, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:51:30 localhost hopeful_mendel[294986]: 167 167 Nov 26 04:51:30 localhost systemd[1]: libpod-05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010.scope: Deactivated successfully. Nov 26 04:51:30 localhost podman[294971]: 2025-11-26 09:51:30.868220586 +0000 UTC m=+0.197981870 container died 05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_mendel, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, release=553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, description=Red Hat Ceph Storage 7) Nov 26 04:51:30 localhost podman[294991]: 2025-11-26 09:51:30.977675443 +0000 UTC 
m=+0.095263611 container remove 05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_mendel, ceph=True, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_BRANCH=main) Nov 26 04:51:30 localhost systemd[1]: libpod-conmon-05c14cb50cf98b6120d94809db6a02706f9fa6def5392e9465bb32820278a010.scope: Deactivated successfully. Nov 26 04:51:31 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)... 
Nov 26 04:51:31 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain Nov 26 04:51:31 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:31 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:31 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:31 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:51:31 localhost podman[295060]: Nov 26 04:51:31 localhost podman[295060]: 2025-11-26 09:51:31.735798542 +0000 UTC m=+0.082635003 container create 74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_yalow, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc.) Nov 26 04:51:31 localhost systemd[1]: Started libpod-conmon-74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8.scope. Nov 26 04:51:31 localhost systemd[1]: Started libcrun container. Nov 26 04:51:31 localhost podman[295060]: 2025-11-26 09:51:31.798582993 +0000 UTC m=+0.145419454 container init 74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_yalow, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, version=7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , release=553) Nov 26 04:51:31 localhost podman[295060]: 2025-11-26 09:51:31.701769895 +0000 UTC m=+0.048606396 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:31 localhost podman[295060]: 2025-11-26 09:51:31.80822793 +0000 UTC m=+0.155064391 container start 74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_yalow, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux , ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph) Nov 26 04:51:31 localhost podman[295060]: 2025-11-26 09:51:31.808502658 +0000 UTC m=+0.155339149 container attach 74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_yalow, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:51:31 localhost determined_yalow[295075]: 167 167 Nov 26 04:51:31 localhost systemd[1]: var-lib-containers-storage-overlay-5e868bb332f89e392c1fe1193f7161e5a156f5434676dcf1159555e6a32152f8-merged.mount: Deactivated successfully. Nov 26 04:51:31 localhost podman[295060]: 2025-11-26 09:51:31.81216264 +0000 UTC m=+0.158999171 container died 74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_yalow, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, ceph=True, release=553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:51:31 localhost systemd[1]: libpod-74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8.scope: 
Deactivated successfully. Nov 26 04:51:31 localhost systemd[1]: tmp-crun.uvFJCJ.mount: Deactivated successfully. Nov 26 04:51:31 localhost systemd[1]: var-lib-containers-storage-overlay-2250494a1fc17328946cf9b70d8baa0f8136bad9e05e1d4164a10a879376178c-merged.mount: Deactivated successfully. Nov 26 04:51:31 localhost podman[295080]: 2025-11-26 09:51:31.897432324 +0000 UTC m=+0.076649529 container remove 74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_yalow, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, GIT_CLEAN=True, version=7, ceph=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:51:31 localhost systemd[1]: libpod-conmon-74236bd3c3f6334630f4bc077d1e6303c8066b99ed7ceef286877a8eb5a032e8.scope: Deactivated successfully. 
Nov 26 04:51:31 localhost nova_compute[281415]: 2025-11-26 09:51:31.977 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:32 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536118.anceyj (monmap changed)... Nov 26 04:51:32 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:51:32 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:32 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:51:32 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:51:32 localhost podman[295148]: Nov 26 04:51:32 localhost podman[295148]: 2025-11-26 09:51:32.674850106 +0000 UTC m=+0.083062797 container create 79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 04:51:32 localhost systemd[1]: Started libpod-conmon-79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928.scope. Nov 26 04:51:32 localhost systemd[1]: Started libcrun container. Nov 26 04:51:32 localhost podman[295148]: 2025-11-26 09:51:32.640754387 +0000 UTC m=+0.048967108 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:51:32 localhost podman[295148]: 2025-11-26 09:51:32.746406667 +0000 UTC m=+0.154619358 container init 79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=Guillaume Abrioux , version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, com.redhat.component=rhceph-container) Nov 26 04:51:32 localhost podman[295148]: 2025-11-26 09:51:32.755504277 +0000 UTC m=+0.163716978 container start 79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, version=7, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vcs-type=git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=) Nov 26 04:51:32 localhost podman[295148]: 2025-11-26 09:51:32.755770875 +0000 UTC m=+0.163983576 container attach 79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, release=553, ceph=True, name=rhceph, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7) Nov 26 04:51:32 localhost interesting_diffie[295163]: 167 167 Nov 26 04:51:32 localhost systemd[1]: libpod-79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928.scope: Deactivated successfully. Nov 26 04:51:32 localhost podman[295148]: 2025-11-26 09:51:32.762732468 +0000 UTC m=+0.170945189 container died 79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 04:51:32 localhost systemd[1]: var-lib-containers-storage-overlay-68c5a3e17c534d3e534f9626add6211cd4f1f24b7e0913f84bf617170c6c4d31-merged.mount: Deactivated successfully. Nov 26 04:51:32 localhost podman[295168]: 2025-11-26 09:51:32.861873218 +0000 UTC m=+0.090752923 container remove 79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, build-date=2025-09-24T08:57:55, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 04:51:32 localhost systemd[1]: libpod-conmon-79fd7aa011acd054740e9a2234f9af1e8f511fc1d45195c1b57ff0b7f8228928.scope: Deactivated successfully. Nov 26 04:51:33 localhost ceph-mon[288827]: mon.np0005536118@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:51:33 localhost ceph-mon[288827]: Reconfiguring mon.np0005536118 (monmap changed)... 
Nov 26 04:51:33 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536118 on np0005536118.localdomain
Nov 26 04:51:33 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:33 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:33 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:51:33 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:51:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:51:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:51:33 localhost podman[295186]: 2025-11-26 09:51:33.834717701 +0000 UTC m=+0.087527013 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:51:33 localhost podman[295185]: 2025-11-26 09:51:33.885437851 +0000 UTC m=+0.140349878 container health_status 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 04:51:33 localhost podman[295186]: 2025-11-26 09:51:33.901429363 +0000 UTC m=+0.154238625 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd) Nov 26 04:51:33 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:51:33 localhost podman[295185]: 2025-11-26 09:51:33.919439237 +0000 UTC m=+0.174351314 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 26 04:51:33 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:51:34 localhost nova_compute[281415]: 2025-11-26 09:51:34.088 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:34 localhost ceph-mon[288827]: Reconfiguring crash.np0005536119 (monmap changed)...
Nov 26 04:51:34 localhost ceph-mon[288827]: Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain
Nov 26 04:51:34 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:34 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:34 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 26 04:51:35 localhost ceph-mon[288827]: Reconfiguring osd.1 (monmap changed)...
Nov 26 04:51:35 localhost ceph-mon[288827]: Reconfiguring daemon osd.1 on np0005536119.localdomain
Nov 26 04:51:35 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:35 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:35 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:35 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 26 04:51:35 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:36 localhost ceph-mon[288827]: Added label _no_schedule to host np0005536113.localdomain
Nov 26 04:51:36 localhost ceph-mon[288827]: Reconfiguring osd.3 (monmap changed)...
Nov 26 04:51:36 localhost ceph-mon[288827]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005536113.localdomain
Nov 26 04:51:36 localhost ceph-mon[288827]: Reconfiguring daemon osd.3 on np0005536119.localdomain
Nov 26 04:51:36 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:36 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:36 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:51:36 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:51:37 localhost nova_compute[281415]: 2025-11-26 09:51:37.022 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:37 localhost ceph-mon[288827]: Reconfiguring mds.mds.np0005536119.dxhchp (monmap changed)...
Nov 26 04:51:37 localhost ceph-mon[288827]: Reconfiguring daemon mds.mds.np0005536119.dxhchp on np0005536119.localdomain
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:51:37 localhost ceph-mon[288827]: Reconfiguring mgr.np0005536119.eupicg (monmap changed)...
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:51:37 localhost ceph-mon[288827]: Reconfiguring daemon mgr.np0005536119.eupicg on np0005536119.localdomain
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536113.localdomain"} : dispatch
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536113.localdomain"} : dispatch
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005536113.localdomain"}]': finished
Nov 26 04:51:37 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:38 localhost ceph-mon[288827]: mon.np0005536118@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:51:38 localhost sshd[295231]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:51:38 localhost ceph-mon[288827]: Removed host np0005536113.localdomain
Nov 26 04:51:38 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:38 localhost ceph-mon[288827]: Reconfiguring mon.np0005536119 (monmap changed)...
Nov 26 04:51:38 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:51:38 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536119 on np0005536119.localdomain
Nov 26 04:51:38 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:38 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:38 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:51:39 localhost nova_compute[281415]: 2025-11-26 09:51:39.117 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:39 localhost sshd[295418]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:51:39 localhost systemd-logind[761]: New session 67 of user tripleo-admin.
Nov 26 04:51:39 localhost systemd[1]: Created slice User Slice of UID 1003.
Nov 26 04:51:39 localhost systemd[1]: Starting User Runtime Directory /run/user/1003...
Nov 26 04:51:39 localhost systemd[1]: Finished User Runtime Directory /run/user/1003.
Nov 26 04:51:39 localhost systemd[1]: Starting User Manager for UID 1003...
Nov 26 04:51:39 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/etc/ceph/ceph.conf
Nov 26 04:51:39 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf
Nov 26 04:51:39 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf
Nov 26 04:51:39 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf
Nov 26 04:51:39 localhost systemd[295463]: Queued start job for default target Main User Target.
Nov 26 04:51:39 localhost systemd[295463]: Created slice User Application Slice.
Nov 26 04:51:39 localhost systemd[295463]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 26 04:51:39 localhost systemd[295463]: Started Daily Cleanup of User's Temporary Directories.
Nov 26 04:51:39 localhost systemd[295463]: Reached target Paths.
Nov 26 04:51:39 localhost systemd[295463]: Reached target Timers.
Nov 26 04:51:39 localhost systemd[295463]: Starting D-Bus User Message Bus Socket...
Nov 26 04:51:39 localhost systemd[295463]: Starting Create User's Volatile Files and Directories...
Nov 26 04:51:39 localhost systemd[295463]: Finished Create User's Volatile Files and Directories.
Nov 26 04:51:39 localhost systemd[295463]: Listening on D-Bus User Message Bus Socket.
Nov 26 04:51:39 localhost systemd[295463]: Reached target Sockets.
Nov 26 04:51:39 localhost systemd[295463]: Reached target Basic System.
Nov 26 04:51:39 localhost systemd[295463]: Reached target Main User Target.
Nov 26 04:51:39 localhost systemd[295463]: Startup finished in 144ms.
Nov 26 04:51:39 localhost systemd[1]: Started User Manager for UID 1003.
Nov 26 04:51:39 localhost systemd[1]: Started Session 67 of User tripleo-admin.
Nov 26 04:51:40 localhost python3[295705]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line= - ip_netmask: 172.18.0.104/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 04:51:40 localhost ceph-mon[288827]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:51:40 localhost ceph-mon[288827]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:51:40 localhost ceph-mon[288827]: Updating np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:51:40 localhost ceph-mon[288827]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:40 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:51:41 localhost python3[295852]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.104/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:51:41 localhost ceph-mon[288827]: Reconfiguring mon.np0005536114 (monmap changed)...
Nov 26 04:51:41 localhost ceph-mon[288827]: Reconfiguring daemon mon.np0005536114 on np0005536114.localdomain
Nov 26 04:51:41 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:41 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:42 localhost nova_compute[281415]: 2025-11-26 09:51:42.077 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:42 localhost python3[295997]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.104 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 04:51:43 localhost ceph-mon[288827]: mon.np0005536118@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:51:44 localhost nova_compute[281415]: 2025-11-26 09:51:44.155 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:45 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:45 localhost openstack_network_exporter[242153]: ERROR 09:51:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 26 04:51:45 localhost openstack_network_exporter[242153]: ERROR 09:51:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:51:45 localhost openstack_network_exporter[242153]: ERROR 09:51:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:51:45 localhost openstack_network_exporter[242153]: ERROR 09:51:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 26 04:51:45 localhost openstack_network_exporter[242153]:
Nov 26 04:51:45 localhost openstack_network_exporter[242153]: ERROR 09:51:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 26 04:51:45 localhost openstack_network_exporter[242153]:
Nov 26 04:51:46 localhost ceph-mon[288827]: Saving service mon spec with placement label:mon
Nov 26 04:51:46 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:46 localhost ceph-mon[288827]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:51:46 localhost ceph-mon[288827]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg'
Nov 26 04:51:47 localhost nova_compute[281415]: 2025-11-26 09:51:47.109 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:48 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcc98f82c0 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Nov 26 04:51:48 localhost ceph-mon[288827]: mon.np0005536118@2(peon) e11 removed from monmap, suicide.
Nov 26 04:51:48 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 26 04:51:48 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 26 04:51:48 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcbfcdb600 mon_map magic: 0 from mon.2 v2:172.18.0.103:3300/0
Nov 26 04:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 04:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:51:48 localhost podman[296050]: 2025-11-26 09:51:48.30133452 +0000 UTC m=+0.064600168 container died bbb2d15582705a5b34fb4367dd88de91e3439671aa3a6cf770afa1b9821f781f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mon-np0005536118, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 26 04:51:48 localhost systemd[1]: var-lib-containers-storage-overlay-182a500ca4ade7edc3bd32620c5cdbda21648c79651a358836c91ebc9ca1b0cc-merged.mount: Deactivated successfully.
Nov 26 04:51:48 localhost podman[296050]: 2025-11-26 09:51:48.359986724 +0000 UTC m=+0.123252332 container remove bbb2d15582705a5b34fb4367dd88de91e3439671aa3a6cf770afa1b9821f781f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mon-np0005536118, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, name=rhceph, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 04:51:48 localhost podman[296063]: 2025-11-26 09:51:48.398156609 +0000 UTC m=+0.101752132 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 26 04:51:48 localhost podman[296063]: 2025-11-26 09:51:48.434633361 +0000 UTC m=+0.138228864 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 26 04:51:48 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully.
Nov 26 04:51:48 localhost podman[296064]: 2025-11-26 09:51:48.454306766 +0000 UTC m=+0.157432524 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 26 04:51:48 localhost podman[296064]: 2025-11-26 09:51:48.465972654 +0000 UTC m=+0.169098382 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:51:48 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully.
Nov 26 04:51:49 localhost nova_compute[281415]: 2025-11-26 09:51:49.158 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:49 localhost systemd[1]: ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606@mon.np0005536118.service: Deactivated successfully.
Nov 26 04:51:49 localhost systemd[1]: Stopped Ceph mon.np0005536118 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606.
Nov 26 04:51:49 localhost systemd[1]: ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606@mon.np0005536118.service: Consumed 7.651s CPU time.
Nov 26 04:51:49 localhost systemd[1]: Reloading.
Nov 26 04:51:49 localhost systemd-rc-local-generator[296252]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 26 04:51:49 localhost systemd-sysv-generator[296255]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:51:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 26 04:51:51 localhost podman[296370]: 2025-11-26 09:51:51.062145049 +0000 UTC m=+0.090882777 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, RELEASE=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Nov 26 04:51:51 localhost podman[296370]: 2025-11-26 09:51:51.188698141 +0000 UTC m=+0.217435839 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12, distribution-scope=public, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 04:51:51 localhost nova_compute[281415]: 2025-11-26 09:51:51.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:51:51 localhost nova_compute[281415]: 2025-11-26 09:51:51.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:51:51 localhost nova_compute[281415]: 2025-11-26 09:51:51.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:51:51 localhost nova_compute[281415]: 2025-11-26 09:51:51.850 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 26 04:51:52 localhost nova_compute[281415]: 2025-11-26 09:51:52.152 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:52 localhost nova_compute[281415]: 2025-11-26 09:51:52.845 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:51:52 localhost nova_compute[281415]: 2025-11-26 09:51:52.872 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:51:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:51:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.
Nov 26 04:51:53 localhost podman[296488]: 2025-11-26 09:51:53.578404684 +0000 UTC m=+0.095226230 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 26 04:51:53 localhost podman[296488]: 2025-11-26 09:51:53.690025028 +0000 UTC m=+0.206846584 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 26 04:51:53 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:51:53 localhost podman[296521]: 2025-11-26 09:51:53.710204128 +0000 UTC m=+0.124346695 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 26 04:51:53 localhost podman[296521]: 2025-11-26 09:51:53.727319584 +0000 UTC m=+0.141462131 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9)
Nov 26 04:51:53 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully.
Nov 26 04:51:53 localhost nova_compute[281415]: 2025-11-26 09:51:53.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:51:54 localhost nova_compute[281415]: 2025-11-26 09:51:54.161 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:51:54 localhost nova_compute[281415]: 2025-11-26 09:51:54.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 04:51:54 localhost nova_compute[281415]: 2025-11-26 09:51:54.869 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 26 04:51:54 localhost nova_compute[281415]: 2025-11-26 09:51:54.869 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 26 04:51:54 localhost nova_compute[281415]: 2025-11-26 09:51:54.870 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 26 04:51:54 localhost nova_compute[281415]: 2025-11-26 09:51:54.870 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 26 04:51:54 localhost nova_compute[281415]: 2025-11-26 09:51:54.870 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.356 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.427 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.428 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.643 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.645 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11832MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0",
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.645 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.646 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.716 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.717 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.717 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:51:55 localhost nova_compute[281415]: 2025-11-26 09:51:55.754 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:51:56 localhost nova_compute[281415]: 2025-11-26 09:51:56.232 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:51:56 localhost nova_compute[281415]: 2025-11-26 09:51:56.238 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:51:56 localhost nova_compute[281415]: 
2025-11-26 09:51:56.258 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:51:56 localhost nova_compute[281415]: 2025-11-26 09:51:56.261 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:51:56 localhost nova_compute[281415]: 2025-11-26 09:51:56.261 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:51:56 localhost systemd[1]: tmp-crun.jrnH7C.mount: Deactivated successfully. 
Nov 26 04:51:56 localhost podman[296897]: 2025-11-26 09:51:56.840370827 +0000 UTC m=+0.098011006 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:51:56 localhost podman[296897]: 2025-11-26 09:51:56.851332454 +0000 UTC m=+0.108972663 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:51:56 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:51:57 localhost nova_compute[281415]: 2025-11-26 09:51:57.184 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:57 localhost nova_compute[281415]: 2025-11-26 09:51:57.262 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:51:57 localhost nova_compute[281415]: 2025-11-26 09:51:57.262 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:51:57 localhost podman[240049]: time="2025-11-26T09:51:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:51:57 localhost podman[240049]: @ - - [26/Nov/2025:09:51:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151669 "" "Go-http-client/1.1" Nov 26 04:51:57 localhost podman[240049]: @ - - [26/Nov/2025:09:51:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18226 "" "Go-http-client/1.1" Nov 26 04:51:58 localhost nova_compute[281415]: 2025-11-26 09:51:58.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:51:58 localhost nova_compute[281415]: 2025-11-26 09:51:58.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:51:58 localhost nova_compute[281415]: 2025-11-26 09:51:58.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:51:59 localhost nova_compute[281415]: 2025-11-26 09:51:59.039 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:51:59 localhost nova_compute[281415]: 2025-11-26 09:51:59.040 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:51:59 localhost nova_compute[281415]: 2025-11-26 09:51:59.040 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:51:59 localhost nova_compute[281415]: 2025-11-26 09:51:59.041 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:51:59 localhost nova_compute[281415]: 2025-11-26 09:51:59.192 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:51:59 localhost nova_compute[281415]: 2025-11-26 09:51:59.372 281419 
DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:51:59 localhost nova_compute[281415]: 2025-11-26 09:51:59.394 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:51:59 localhost nova_compute[281415]: 2025-11-26 09:51:59.395 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:52:01 localhost sshd[296919]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:52:02 localhost podman[297045]: Nov 26 04:52:02 localhost podman[297045]: 2025-11-26 09:52:02.08099717 +0000 UTC m=+0.072897833 container create e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_cartwright, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, version=7, name=rhceph, vcs-type=git, distribution-scope=public) Nov 26 04:52:02 localhost systemd[1]: Started libpod-conmon-e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67.scope. Nov 26 04:52:02 localhost systemd[1]: Started libcrun container. 
Nov 26 04:52:02 localhost podman[297045]: 2025-11-26 09:52:02.049703788 +0000 UTC m=+0.041604521 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:52:02 localhost podman[297045]: 2025-11-26 09:52:02.153304854 +0000 UTC m=+0.145205577 container init e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_cartwright, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux ) Nov 26 04:52:02 localhost podman[297045]: 2025-11-26 09:52:02.163658363 +0000 UTC m=+0.155559046 container start e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_cartwright, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base 
image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:52:02 localhost podman[297045]: 2025-11-26 09:52:02.163987123 +0000 UTC m=+0.155887856 container attach e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_cartwright, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, version=7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True) Nov 26 04:52:02 localhost 
sad_cartwright[297064]: 167 167 Nov 26 04:52:02 localhost systemd[1]: libpod-e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67.scope: Deactivated successfully. Nov 26 04:52:02 localhost podman[297045]: 2025-11-26 09:52:02.169458512 +0000 UTC m=+0.161359185 container died e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_cartwright, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, io.buildah.version=1.33.12, distribution-scope=public, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 04:52:02 localhost nova_compute[281415]: 2025-11-26 09:52:02.223 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:02 localhost podman[297069]: 2025-11-26 09:52:02.291385491 +0000 UTC m=+0.113045247 container remove e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_cartwright, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, 
io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, version=7, maintainer=Guillaume Abrioux ) Nov 26 04:52:02 localhost systemd[1]: libpod-conmon-e5fa9c2b18d69e32a7d38bad23882e2307a7e38c5b07e712c83b952b02455a67.scope: Deactivated successfully. 
Nov 26 04:52:02 localhost podman[297085]: Nov 26 04:52:02 localhost podman[297085]: 2025-11-26 09:52:02.426961882 +0000 UTC m=+0.092513186 container create 0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_bose, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, vcs-type=git, ceph=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=553, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Nov 26 04:52:02 localhost systemd[1]: Started libpod-conmon-0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba.scope. Nov 26 04:52:02 localhost systemd[1]: Started libcrun container. 
Nov 26 04:52:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b411361888ccca7ccc7f982674bac7d09e3e4024e92ee94ae63fa5a93feb1b/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Nov 26 04:52:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b411361888ccca7ccc7f982674bac7d09e3e4024e92ee94ae63fa5a93feb1b/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Nov 26 04:52:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b411361888ccca7ccc7f982674bac7d09e3e4024e92ee94ae63fa5a93feb1b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 26 04:52:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13b411361888ccca7ccc7f982674bac7d09e3e4024e92ee94ae63fa5a93feb1b/merged/var/lib/ceph/mon/ceph-np0005536118 supports timestamps until 2038 (0x7fffffff) Nov 26 04:52:02 localhost podman[297085]: 2025-11-26 09:52:02.389229221 +0000 UTC m=+0.054780565 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:52:02 localhost podman[297085]: 2025-11-26 09:52:02.495939363 +0000 UTC m=+0.161490637 container init 0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_bose, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., version=7, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12) Nov 26 04:52:02 localhost podman[297085]: 2025-11-26 09:52:02.50461245 +0000 UTC m=+0.170163764 container start 0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_bose, io.buildah.version=1.33.12, release=553, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux ) Nov 26 04:52:02 localhost podman[297085]: 2025-11-26 09:52:02.504852697 +0000 UTC m=+0.170403991 container attach 0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_bose, distribution-scope=public, io.openshift.tags=rhceph ceph, 
GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, RELEASE=main, GIT_CLEAN=True, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph) Nov 26 04:52:02 localhost systemd[1]: libpod-0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba.scope: Deactivated successfully. 
Nov 26 04:52:02 localhost podman[297085]: 2025-11-26 09:52:02.61025765 +0000 UTC m=+0.275808954 container died 0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_bose, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-type=git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , version=7, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container) Nov 26 04:52:02 localhost podman[297126]: 2025-11-26 09:52:02.707053887 +0000 UTC m=+0.087968027 container remove 0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_bose, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., release=553, io.buildah.version=1.33.12, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 04:52:02 localhost systemd[1]: libpod-conmon-0d095e24e7c68e9b1a2b6aa52906879d2a9f006d21d8e83fbbb68ddb899de6ba.scope: Deactivated successfully. Nov 26 04:52:02 localhost systemd[1]: Reloading. Nov 26 04:52:02 localhost systemd-rc-local-generator[297167]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:52:02 localhost systemd-sysv-generator[297172]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:02 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: var-lib-containers-storage-overlay-dd38f9bd6b91996028c058abc2bfe677c2844fab834e44be0df1dfcbfdefc11d-merged.mount: Deactivated successfully. Nov 26 04:52:03 localhost systemd[1]: Reloading. Nov 26 04:52:03 localhost systemd-rc-local-generator[297210]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 26 04:52:03 localhost systemd-sysv-generator[297213]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 26 04:52:03 localhost systemd[1]: Starting Ceph mon.np0005536118 for 0d5e5e6d-3c4b-5efe-8c65-346ae6715606... Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.580 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.581 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.586 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58d2180a-3a04-48e5-9e34-0a885e9cd1c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:52:03.582021', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8f99ed7e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.824281232, 'message_signature': 'b6623f0dd0d235b7374353b3089bee3dd82dd70303753a569e013e4c3c23b726'}]}, 'timestamp': '2025-11-26 09:52:03.587202', '_unique_id': '82833eb07e20486390ba07837f0bdff7'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.588 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.590 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.590 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea677ab3-f67e-4746-8313-dfa729ebc2fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7111, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:52:03.590898', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8f9a9ae4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.824281232, 'message_signature': '0b7d6f1d077b27e7a3c223c6ad5e3746b787412b6529bcea2cd0b0ef8843d04b'}]}, 'timestamp': '2025-11-26 09:52:03.591492', '_unique_id': 'fa8a3410c64d42eaa4a847b66542af91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) 
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.592 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.620 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.620 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd5d4df5-fb90-4753-b11c-4b8dc9f05abe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:52:03.593858', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f9f0dae-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': '1dc78f49fdcf914111666167731c5072fa21a296c82f221bc78ceb9e909d02f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:52:03.593858', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f9f28a2-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': '6172873ef18acd06ef1dc86660991be7c846ca2d7b150acb40e5da7447c0dabd'}]}, 'timestamp': '2025-11-26 09:52:03.621416', '_unique_id': 'eaec4580cec9462ca92269dbd32229a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.622 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.624 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.624 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b18b2dbd-c4b8-4110-9a75-fefea8109d70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:52:03.624796', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8f9fc67c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.824281232, 'message_signature': 'bfacc76b7f465e9b641d9c8b0f1af5a01da8a68c65be0816d49c5bd513553a4e'}]}, 'timestamp': '2025-11-26 09:52:03.625371', '_unique_id': 'c90ecc433a434f8cbd5a5ea9418fae73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.626 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.627 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.628 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c26bb14-fbaf-4d4e-bd86-2f0aa28aa3e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:52:03.628067', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8fa0443a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.824281232, 'message_signature': '1ff26ceb5680e7647cfae42cc2a97c1e6b6e95bda6de9f3cb44e810f4e6e01e6'}]}, 'timestamp': '2025-11-26 09:52:03.628589', '_unique_id': '0e2421a6b90441568856b7d014a2abaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.629 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.630 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.630 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62824a5b-66d3-4c98-af97-387a2d3a7692', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:52:03.630848', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8fa0b172-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.824281232, 'message_signature': '230d8d6f6b88d479a16ae8a27e61009456a4f03a142550e0330ce755ae9759d4'}]}, 'timestamp': '2025-11-26 09:52:03.631382', '_unique_id': 'b813ad5e1d844b15b769453a4d86e0e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.632 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.633 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.647 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '861d7b25-44f2-4517-b373-062b6a6d8e17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:52:03.633743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8fa34996-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.889993493, 'message_signature': '05f0e13d45beb0ea0f1e5478033c3e658e4d1e77cdf101946864b60861203b28'}]}, 'timestamp': '2025-11-26 09:52:03.648367', '_unique_id': '48fb104ea24846fda6a08f30d13eb3d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:52:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:52:03.656 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404 Nov 26 04:52:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:52:03.658 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409 Nov 26 04:52:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:52:03.660 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.659 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1905ec7f-d4ad-4fa4-8be5-e12b04ad1196', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:52:03.650548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fa517e4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.892799189, 'message_signature': '56ba3c210b9b53902765e2deb5d8725bf8762588d6c461cd1c8c7f0ea3da0314'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:52:03.650548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fa52996-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.892799189, 'message_signature': 'b444edfeed11b798c38c2370c967e83674c33987f214ad67ccfb9c1cda30284c'}]}, 'timestamp': '2025-11-26 09:52:03.660646', '_unique_id': '3e6fda330f7b434fa7bb87844eecfe5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:52:03.661 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:52:03.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.662 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '84239d77-5870-450b-b125-1b656f4643c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:52:03.662978', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8fa59804-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.824281232, 'message_signature': '56548d74ff3128823f5327933e891c80f6c915010ea8075fce95ab21eaf7cd96'}]}, 'timestamp': '2025-11-26 09:52:03.663531', '_unique_id': 'f00cb5aa52a54eb89c5ab48863ab81aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.665 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.665 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.666 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '029a1452-1b6e-4822-b60c-95f826a2ff45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:52:03.665701', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fa60172-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': 'da68fc22ec7abc46efeb0ed899138232364422cafe009dc24ef4de1346f96ec3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:52:03.665701', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fa61522-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': '9d378328e21383224d6af1fa3810c1449b17f59743c5b6951a18142d9a5c9599'}]}, 'timestamp': '2025-11-26 09:52:03.666675', '_unique_id': 'dbecf0e5dbfb4b0fa23e2a9def1d7f1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.668 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.669 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3e577196-461c-4874-932a-bccb97392451', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:52:03.668901', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fa67e9a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': '0de0d07c9aa94bd6f18f5af1d5254c4e6c27dd36065a9f703ea3f782da0886b7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:52:03.668901', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fa68f0c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': 'd2c26e0e0345f8ebfb46b6e0bb76b9af755a5dc65fbddb3371032cdfb3da796a'}]}, 'timestamp': '2025-11-26 09:52:03.669788', '_unique_id': '87f5724829094143accb1888511c1739'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.671 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ded8210-9aa7-42bc-8d32-d502f8c668aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:52:03.672030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fa6f7f8-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': '732cc34261e7c33dc4e46eba88ba83ca39f280c5bf049d9cf5b55b3487b68abf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 
'timestamp': '2025-11-26T09:52:03.672030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fa70842-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': '195de32328b7572f77385192a571ebe3567345d9f4a40a1f2480bb430d319944'}]}, 'timestamp': '2025-11-26 09:52:03.672890', '_unique_id': 'fb6cc732ee8c4e5486f59c4a220e12a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.674 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a817d60-902b-46f5-b9db-cf57bb415b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:52:03.675112', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fa7703e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': '629595f4e488d55527f049cff891b3a58edd13b40b5fc95ab06d8e621bbd56cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 
'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:52:03.675112', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fa781d2-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': 'd65af00e79ffb732ed46f0f3b3e1a69b3a8a446974a29e293ad12ad08d051319'}]}, 'timestamp': '2025-11-26 09:52:03.676031', '_unique_id': '775ee44ce3e4423cbd01d15f512344e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'accb7147-b684-465d-81e7-3008548dac73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:52:03.678190', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8fa7e83e-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.824281232, 'message_signature': '228ff3aa50fb87e71e7eb0621abd369a7c138ab816d6d31b22cebb6c25c32d19'}]}, 'timestamp': '2025-11-26 09:52:03.678653', '_unique_id': 'a121d142211e45efb244a1c0ed81f12a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR 
oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.681 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.681 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a5cad5a9-b446-42cc-a9fb-4af2331940be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:52:03.680924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fa854ea-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': '80ff801b9fbbda52828169bd073f9fed95a10b65ce4d4f496ed5f0ecf451f188'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:52:03.680924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fa8652a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.836154247, 'message_signature': 'b7334eb1a3ed3cf54e2d7b62dbd0a8c8958a87c8eb22960ee5fab28246db6d3e'}]}, 'timestamp': '2025-11-26 09:52:03.681824', '_unique_id': 'd8076d5d175648c3a470f863f481345d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.684 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4dda738a-f279-4bd2-a327-9c8564177382', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:52:03.684689', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8fa8e66c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11198.824281232, 'message_signature': '629c610c391c137bb1a78e85647cbe86e05f43045c9d4995c814ae23a5034283'}]}, 'timestamp': '2025-11-26 09:52:03.685195', '_unique_id': '6e143499f1754b9b93d6c46a545a2ada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:52:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:52:03.686 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:52:29 localhost nova_compute[281415]: 2025-11-26 09:52:29.365 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:29 localhost rsyslogd[760]: imjournal: 930 messages lost due to rate-limiting (20000 allowed within 600 seconds) Nov 26 04:52:32 localhost nova_compute[281415]: 2025-11-26 09:52:32.417 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:34 localhost nova_compute[281415]: 2025-11-26 09:52:34.410 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:52:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:52:34 localhost systemd[1]: tmp-crun.cPWWZT.mount: Deactivated successfully. 
Nov 26 04:52:34 localhost podman[298221]: 2025-11-26 09:52:34.903199908 +0000 UTC m=+0.155287677 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:52:34 localhost podman[298222]: 2025-11-26 09:52:34.864573641 +0000 UTC 
m=+0.113239375 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:52:34 localhost podman[298221]: 2025-11-26 09:52:34.93737567 +0000 UTC m=+0.189463419 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 26 04:52:34 localhost podman[298222]: 2025-11-26 09:52:34.947504121 +0000 UTC m=+0.196169825 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 04:52:34 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:52:34 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:52:37 localhost nova_compute[281415]: 2025-11-26 09:52:37.449 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:39 localhost nova_compute[281415]: 2025-11-26 09:52:39.448 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:40 localhost sshd[298256]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:52:42 localhost nova_compute[281415]: 2025-11-26 09:52:42.501 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:44 localhost systemd[1]: session-67.scope: Deactivated successfully. Nov 26 04:52:44 localhost systemd[1]: session-67.scope: Consumed 1.836s CPU time. Nov 26 04:52:44 localhost systemd-logind[761]: Session 67 logged out. Waiting for processes to exit. Nov 26 04:52:44 localhost systemd-logind[761]: Removed session 67. Nov 26 04:52:44 localhost nova_compute[281415]: 2025-11-26 09:52:44.451 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:44 localhost ceph-mon[297296]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:52:44 localhost ceph-mon[297296]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:52:44 localhost ceph-mon[297296]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:52:44 localhost ceph-mon[297296]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:52:44 localhost ceph-mon[297296]: from='mgr.17079 172.18.0.108:0/2900779467' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 26 04:52:44 localhost ceph-mon[297296]: from='client.? 
' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:52:44 localhost ceph-mon[297296]: Activating manager daemon np0005536112.srlncr Nov 26 04:52:44 localhost ceph-mon[297296]: from='client.? 172.18.0.200:0/138424909' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:52:44 localhost ceph-mon[297296]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 26 04:52:44 localhost ceph-mon[297296]: from='mgr.17079 ' entity='mgr.np0005536119.eupicg' Nov 26 04:52:45 localhost openstack_network_exporter[242153]: ERROR 09:52:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:52:45 localhost openstack_network_exporter[242153]: ERROR 09:52:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:52:45 localhost openstack_network_exporter[242153]: ERROR 09:52:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:52:45 localhost openstack_network_exporter[242153]: Nov 26 04:52:45 localhost openstack_network_exporter[242153]: ERROR 09:52:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:52:45 localhost openstack_network_exporter[242153]: ERROR 09:52:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:52:45 localhost openstack_network_exporter[242153]: Nov 26 04:52:47 localhost nova_compute[281415]: 2025-11-26 09:52:47.554 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:49 localhost nova_compute[281415]: 2025-11-26 09:52:49.453 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:52:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:52:49 localhost podman[298258]: 2025-11-26 09:52:49.840212977 +0000 UTC m=+0.089770132 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:52:49 localhost podman[298258]: 2025-11-26 09:52:49.852475665 +0000 UTC m=+0.102032860 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:52:49 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:52:49 localhost podman[298259]: 2025-11-26 09:52:49.94663598 +0000 UTC m=+0.191774529 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible) Nov 26 04:52:49 localhost podman[298259]: 2025-11-26 09:52:49.962269252 +0000 UTC m=+0.207407871 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 26 04:52:49 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:52:52 localhost nova_compute[281415]: 2025-11-26 09:52:52.391 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:52 localhost nova_compute[281415]: 2025-11-26 09:52:52.586 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:53 localhost ceph-mon[297296]: mon.np0005536118@-1(probing) e11 handle_auth_request failed to assign global_id Nov 26 04:52:53 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Nov 26 04:52:53 localhost sshd[298298]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:52:53 localhost systemd-logind[761]: New session 69 of user ceph-admin. Nov 26 04:52:53 localhost systemd[1]: Created slice User Slice of UID 1002. Nov 26 04:52:53 localhost systemd[1]: Starting User Runtime Directory /run/user/1002... Nov 26 04:52:53 localhost systemd[1]: Finished User Runtime Directory /run/user/1002. Nov 26 04:52:53 localhost systemd[1]: Starting User Manager for UID 1002... Nov 26 04:52:53 localhost systemd[298302]: Queued start job for default target Main User Target. Nov 26 04:52:53 localhost systemd[298302]: Created slice User Application Slice. Nov 26 04:52:53 localhost systemd[298302]: Started Mark boot as successful after the user session has run 2 minutes. 
Nov 26 04:52:53 localhost systemd[298302]: Started Daily Cleanup of User's Temporary Directories. Nov 26 04:52:53 localhost systemd[298302]: Reached target Paths. Nov 26 04:52:53 localhost systemd[298302]: Reached target Timers. Nov 26 04:52:53 localhost systemd[298302]: Starting D-Bus User Message Bus Socket... Nov 26 04:52:53 localhost systemd[298302]: Starting Create User's Volatile Files and Directories... Nov 26 04:52:53 localhost systemd[298302]: Listening on D-Bus User Message Bus Socket. Nov 26 04:52:53 localhost systemd[298302]: Reached target Sockets. Nov 26 04:52:53 localhost systemd[298302]: Finished Create User's Volatile Files and Directories. Nov 26 04:52:53 localhost systemd[298302]: Reached target Basic System. Nov 26 04:52:53 localhost systemd[1]: Started User Manager for UID 1002. Nov 26 04:52:53 localhost systemd[298302]: Reached target Main User Target. Nov 26 04:52:53 localhost systemd[298302]: Startup finished in 159ms. Nov 26 04:52:53 localhost nova_compute[281415]: 2025-11-26 09:52:53.846 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:53.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:53.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 
09:52:53.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:52:54 localhost systemd[1]: Started Session 69 of User ceph-admin. Nov 26 04:52:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:52:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:52:54 localhost podman[298317]: 2025-11-26 09:52:54.251597984 +0000 UTC m=+0.089445492 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:52:54 localhost podman[298318]: 2025-11-26 09:52:54.22707423 +0000 UTC m=+0.067782336 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible) Nov 26 04:52:54 localhost podman[298317]: 2025-11-26 09:52:54.320192795 +0000 UTC m=+0.158040293 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 04:52:54 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:52:54 localhost podman[298318]: 2025-11-26 09:52:54.336343481 +0000 UTC m=+0.177051537 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9) Nov 26 04:52:54 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:52:54 localhost systemd[1]: Stopping User Manager for UID 1003... Nov 26 04:52:54 localhost systemd[295463]: Activating special unit Exit the Session... Nov 26 04:52:54 localhost systemd[295463]: Stopped target Main User Target. Nov 26 04:52:54 localhost systemd[295463]: Stopped target Basic System. Nov 26 04:52:54 localhost systemd[295463]: Stopped target Paths. Nov 26 04:52:54 localhost systemd[295463]: Stopped target Sockets. 
Nov 26 04:52:54 localhost systemd[295463]: Stopped target Timers. Nov 26 04:52:54 localhost systemd[295463]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 26 04:52:54 localhost systemd[295463]: Stopped Daily Cleanup of User's Temporary Directories. Nov 26 04:52:54 localhost systemd[295463]: Closed D-Bus User Message Bus Socket. Nov 26 04:52:54 localhost systemd[295463]: Stopped Create User's Volatile Files and Directories. Nov 26 04:52:54 localhost systemd[295463]: Removed slice User Application Slice. Nov 26 04:52:54 localhost systemd[295463]: Reached target Shutdown. Nov 26 04:52:54 localhost systemd[295463]: Finished Exit the Session. Nov 26 04:52:54 localhost systemd[295463]: Reached target Exit the Session. Nov 26 04:52:54 localhost systemd[1]: user@1003.service: Deactivated successfully. Nov 26 04:52:54 localhost systemd[1]: Stopped User Manager for UID 1003. Nov 26 04:52:54 localhost ceph-mon[297296]: mon.np0005536118@-1(probing) e11 handle_auth_request failed to assign global_id Nov 26 04:52:54 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Nov 26 04:52:54 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Nov 26 04:52:54 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Nov 26 04:52:54 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Nov 26 04:52:54 localhost systemd[1]: Removed slice User Slice of UID 1003. Nov 26 04:52:54 localhost systemd[1]: user-1003.slice: Consumed 2.321s CPU time. 
Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:54.457 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:54.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:54.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:54.894 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:54.895 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:54.896 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 
04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:54.896 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:52:54 localhost nova_compute[281415]: 2025-11-26 09:52:54.896 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:52:55 localhost ceph-mon[297296]: mon.np0005536118@-1(probing) e11 handle_auth_request failed to assign global_id Nov 26 04:52:55 localhost podman[298490]: 2025-11-26 09:52:55.342104596 +0000 UTC m=+0.108088875 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, io.buildah.version=1.33.12, release=553, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, name=rhceph, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.365 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.431 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.432 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:52:55 localhost podman[298490]: 2025-11-26 09:52:55.497854887 +0000 UTC m=+0.263839146 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, vcs-type=git, release=553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red 
Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, version=7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.662 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.666 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11747MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.666 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.666 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.748 281419 DEBUG 
nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.749 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.749 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:52:55 localhost nova_compute[281415]: 2025-11-26 09:52:55.794 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:52:55 localhost ceph-mon[297296]: mon.np0005536118@-1(probing) e11 handle_auth_request failed to assign global_id Nov 26 04:52:56 localhost nova_compute[281415]: 2025-11-26 09:52:56.282 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:52:56 
localhost nova_compute[281415]: 2025-11-26 09:52:56.291 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:52:56 localhost nova_compute[281415]: 2025-11-26 09:52:56.308 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:52:56 localhost nova_compute[281415]: 2025-11-26 09:52:56.312 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:52:56 localhost nova_compute[281415]: 2025-11-26 09:52:56.313 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:52:56 localhost ceph-mon[297296]: mon.np0005536118@-1(synchronizing).osd e85 e85: 6 total, 6 up, 6 in Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: 
[db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.693763) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150776693873, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11461, "num_deletes": 277, "total_data_size": 18914395, "memory_usage": 19385936, "flush_reason": "Manual Compaction"} Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Nov 26 04:52:56 localhost ceph-mon[297296]: Activating manager daemon np0005536117.ggibwg Nov 26 04:52:56 localhost ceph-mon[297296]: Manager daemon np0005536112.srlncr is unresponsive, replacing it with standby daemon np0005536117.ggibwg Nov 26 04:52:56 localhost ceph-mon[297296]: Manager daemon np0005536117.ggibwg is now available Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536113.localdomain.devices.0"} : dispatch Nov 26 04:52:56 localhost ceph-mon[297296]: removing stray HostCache host record np0005536113.localdomain.devices.0 Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536113.localdomain.devices.0"} : dispatch Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd='[{"prefix":"config-key 
del","key":"mgr/cephadm/host.np0005536113.localdomain.devices.0"}]': finished Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536113.localdomain.devices.0"} : dispatch Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536113.localdomain.devices.0"} : dispatch Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005536113.localdomain.devices.0"}]': finished Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536117.ggibwg/mirror_snapshot_schedule"} : dispatch Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536117.ggibwg/mirror_snapshot_schedule"} : dispatch Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536117.ggibwg/trash_purge_schedule"} : dispatch Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536117.ggibwg/trash_purge_schedule"} : dispatch Nov 26 04:52:56 localhost ceph-mon[297296]: [26/Nov/2025:09:52:54] ENGINE Bus STARTING Nov 26 04:52:56 localhost ceph-mon[297296]: [26/Nov/2025:09:52:54] ENGINE Serving on https://172.18.0.106:7150 Nov 26 04:52:56 localhost ceph-mon[297296]: [26/Nov/2025:09:52:54] ENGINE Client ('172.18.0.106', 35448) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has 
been closed (EOF) (_ssl.c:1147)') Nov 26 04:52:56 localhost ceph-mon[297296]: [26/Nov/2025:09:52:54] ENGINE Serving on http://172.18.0.106:8765 Nov 26 04:52:56 localhost ceph-mon[297296]: [26/Nov/2025:09:52:54] ENGINE Bus STARTED Nov 26 04:52:56 localhost ceph-mon[297296]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Nov 26 04:52:56 localhost ceph-mon[297296]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Nov 26 04:52:56 localhost ceph-mon[297296]: Cluster is now healthy Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:56 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150776755387, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 15675231, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11466, "table_properties": {"data_size": 15613103, "index_size": 33606, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27461, "raw_key_size": 290361, "raw_average_key_size": 26, "raw_value_size": 15426702, "raw_average_value_size": 1407, "num_data_blocks": 1282, 
"num_entries": 10958, "num_filter_entries": 10958, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 1764150724, "file_creation_time": 1764150776, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 61677 microseconds, and 31197 cpu microseconds. 
Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.755455) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 15675231 bytes OK Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.755481) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.757453) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.757479) EVENT_LOG_v1 {"time_micros": 1764150776757472, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.757500) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18834383, prev total WAL file size 18834424, number of live WAL files 2. Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.760384) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. 
'7061786F73003130373934' seq:0, type:0; will stop at (end) Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(14MB) 8(1887B)] Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150776760466, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 15677118, "oldest_snapshot_seqno": -1} Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10706 keys, 15671851 bytes, temperature: kUnknown Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150776830014, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 15671851, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15610416, "index_size": 33558, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26821, "raw_key_size": 285465, "raw_average_key_size": 26, "raw_value_size": 15427353, "raw_average_value_size": 1441, "num_data_blocks": 1281, "num_entries": 10706, "num_filter_entries": 10706, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764150776, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.830416) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 15671851 bytes Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.832074) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.0 rd, 224.9 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(15.0, 0.0 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10963, records dropped: 257 output_compression: NoCompression Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.832104) EVENT_LOG_v1 {"time_micros": 1764150776832092, "job": 4, "event": "compaction_finished", "compaction_time_micros": 69680, "compaction_time_cpu_micros": 38049, "output_level": 6, "num_output_files": 1, "total_output_size": 15671851, "num_input_records": 10963, "num_output_records": 10706, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000014.sst immediately, 
rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150776834468, "job": 4, "event": "table_file_deletion", "file_number": 14} Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150776834526, "job": 4, "event": "table_file_deletion", "file_number": 8} Nov 26 04:52:56 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:52:56.760263) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:52:57 localhost nova_compute[281415]: 2025-11-26 09:52:57.313 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:57 localhost nova_compute[281415]: 2025-11-26 09:52:57.314 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:57 localhost podman[240049]: time="2025-11-26T09:52:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:52:57 localhost podman[240049]: @ - - [26/Nov/2025:09:52:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:52:57 localhost podman[240049]: @ - - [26/Nov/2025:09:52:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 
18713 "" "Go-http-client/1.1" Nov 26 04:52:57 localhost nova_compute[281415]: 2025-11-26 09:52:57.611 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:52:58 localhost podman[298808]: 2025-11-26 09:52:58.047404197 +0000 UTC m=+0.088039828 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:52:58 localhost podman[298808]: 2025-11-26 09:52:58.058922771 +0000 UTC 
m=+0.099558352 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:52:58 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:52:58 localhost ceph-mon[297296]: Saving service mon spec with placement label:mon Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd/host:np0005536114", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' 
cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd/host:np0005536114", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536117.localdomain to 836.6M Nov 26 04:52:58 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536117.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:52:58 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536119.localdomain to 836.6M Nov 26 04:52:58 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536119.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", 
"who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536118.localdomain to 836.6M Nov 26 04:52:58 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536118.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:52:58 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:52:58 localhost ceph-mon[297296]: Updating np0005536114.localdomain:/etc/ceph/ceph.conf Nov 26 04:52:58 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:52:58 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:52:58 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:52:59 localhost ceph-mon[297296]: mon.np0005536118@-1(probing) e11 handle_auth_request failed to assign global_id Nov 26 04:52:59 localhost nova_compute[281415]: 2025-11-26 09:52:59.504 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:52:59 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcc98f8000 mon_map magic: 0 from mon.2 v2:172.18.0.103:3300/0 Nov 26 04:52:59 localhost nova_compute[281415]: 2025-11-26 09:52:59.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:52:59 localhost nova_compute[281415]: 
2025-11-26 09:52:59.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:52:59 localhost nova_compute[281415]: 2025-11-26 09:52:59.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:53:00 localhost nova_compute[281415]: 2025-11-26 09:53:00.277 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:53:00 localhost nova_compute[281415]: 2025-11-26 09:53:00.278 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:53:00 localhost nova_compute[281415]: 2025-11-26 09:53:00.278 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:53:00 localhost nova_compute[281415]: 2025-11-26 09:53:00.279 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:53:00 localhost ceph-mon[297296]: mon.np0005536118@-1(probing) e12 my rank is now 3 (was -1) Nov 26 
04:53:00 localhost ceph-mon[297296]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:53:00 localhost ceph-mon[297296]: paxos.3).electionLogic(0) init, first boot, initializing epoch at 1 Nov 26 04:53:00 localhost ceph-mon[297296]: mon.np0005536118@3(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:53:01 localhost nova_compute[281415]: 2025-11-26 09:53:01.484 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:53:01 localhost nova_compute[281415]: 2025-11-26 09:53:01.593 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] 
Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:53:01 localhost nova_compute[281415]: 2025-11-26 09:53:01.593 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:53:02 localhost nova_compute[281415]: 2025-11-26 09:53:02.614 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:53:03.657 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:53:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:53:03.658 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:53:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:53:03.659 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:53:03 localhost ceph-mds[286153]: mds.beacon.mds.np0005536118.kohnma missed beacon ack from the monitors Nov 26 04:53:04 localhost nova_compute[281415]: 2025-11-26 09:53:04.548 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536118@3(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536118@3(peon) e12 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536118@3(peon) e12 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536118@3(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:53:04 localhost ceph-mon[297296]: mgrc update_daemon_metadata mon.np0005536118 metadata {addrs=[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005536118.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005536118.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 26 04:53:04 localhost ceph-mon[297296]: Updating np0005536114.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:53:04 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:53:04 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:53:04 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536114 calling monitor election
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536119 calling monitor election
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536117 calling monitor election
Nov 26 04:53:04 localhost ceph-mon[297296]: Updating np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:53:04 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:53:04 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:53:04 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536118 calling monitor election
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536114 is new leader, mons np0005536114,np0005536119,np0005536117,np0005536118 in quorum (ranks 0,1,2,3)
Nov 26 04:53:04 localhost ceph-mon[297296]: overall HEALTH_OK
Nov 26 04:53:04 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:04 localhost ceph-mon[297296]: mon.np0005536118@3(peon) e12 handle_command mon_command({"prefix": "mgr stat", "format": "json"} v 0)
Nov 26 04:53:04 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/4027517598' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 26 04:53:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.
Nov 26 04:53:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.
Nov 26 04:53:05 localhost podman[299436]: 2025-11-26 09:53:05.172499024 +0000 UTC m=+0.096767268 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 26 04:53:05 localhost podman[299436]: 2025-11-26 09:53:05.1873214 +0000 UTC m=+0.111589634 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:53:05 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully.
Nov 26 04:53:05 localhost podman[299435]: 2025-11-26 09:53:05.279236147 +0000 UTC m=+0.205226564 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 04:53:05 localhost podman[299435]: 2025-11-26 09:53:05.311791238 +0000 UTC m=+0.237781655 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 26 04:53:05 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully.
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:05 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:53:06 localhost ceph-mon[297296]: Reconfiguring mon.np0005536114 (monmap changed)...
Nov 26 04:53:06 localhost ceph-mon[297296]: Reconfiguring daemon mon.np0005536114 on np0005536114.localdomain
Nov 26 04:53:06 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:06 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:06 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:53:06 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536114.ddbqmi", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:53:07 localhost nova_compute[281415]: 2025-11-26 09:53:07.649 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:53:07 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536114.ddbqmi (monmap changed)...
Nov 26 04:53:07 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536114.ddbqmi on np0005536114.localdomain
Nov 26 04:53:07 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:07 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:07 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536114.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:53:07 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536114.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:53:07 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:08 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcc98f82c0 mon_map magic: 0 from mon.2 v2:172.18.0.103:3300/0
Nov 26 04:53:08 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 26 04:53:08 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536118@3(peon) e13 my rank is now 2 (was 3)
Nov 26 04:53:08 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 26 04:53:08 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 26 04:53:08 localhost ceph-mon[297296]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election
Nov 26 04:53:08 localhost ceph-mon[297296]: paxos.2).electionLogic(48) init, last seen epoch 48
Nov 26 04:53:08 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcc98f89a0 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536118@2(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536118@2(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536118@2(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536118@2(peon) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 26 04:53:08 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536114 on np0005536114.localdomain
Nov 26 04:53:08 localhost ceph-mon[297296]: Reconfiguring crash.np0005536117 (monmap changed)...
Nov 26 04:53:08 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:53:08 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536117 on np0005536117.localdomain
Nov 26 04:53:08 localhost ceph-mon[297296]: Remove daemons mon.np0005536114
Nov 26 04:53:08 localhost ceph-mon[297296]: Safe to remove mon.np0005536114: new quorum should be ['np0005536119', 'np0005536117', 'np0005536118'] (from ['np0005536119', 'np0005536117', 'np0005536118'])
Nov 26 04:53:08 localhost ceph-mon[297296]: Removing monitor np0005536114 from monmap...
Nov 26 04:53:08 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "mon rm", "name": "np0005536114"} : dispatch
Nov 26 04:53:08 localhost ceph-mon[297296]: Removing daemon mon.np0005536114 from np0005536114.localdomain -- ports []
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536119 calling monitor election
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536118 calling monitor election
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536117 calling monitor election
Nov 26 04:53:08 localhost ceph-mon[297296]: mon.np0005536119 is new leader, mons np0005536119,np0005536117,np0005536118 in quorum (ranks 0,1,2)
Nov 26 04:53:08 localhost ceph-mon[297296]: overall HEALTH_OK
Nov 26 04:53:08 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:09 localhost ceph-mon[297296]: mon.np0005536118@2(peon).osd e85 _set_new_cache_sizes cache_size:1019675549 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:53:09 localhost nova_compute[281415]: 2025-11-26 09:53:09.588 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:53:09 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:09 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:09 localhost ceph-mon[297296]: Reconfiguring osd.2 (monmap changed)...
Nov 26 04:53:09 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 26 04:53:09 localhost ceph-mon[297296]: Reconfiguring daemon osd.2 on np0005536117.localdomain
Nov 26 04:53:09 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:09 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:10 localhost ceph-mon[297296]: Reconfiguring osd.5 (monmap changed)...
Nov 26 04:53:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 26 04:53:10 localhost ceph-mon[297296]: Reconfiguring daemon osd.5 on np0005536117.localdomain
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.841624) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150790841720, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 829, "num_deletes": 261, "total_data_size": 1497264, "memory_usage": 1522176, "flush_reason": "Manual Compaction"}
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150790852035, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1090984, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11467, "largest_seqno": 12295, "table_properties": {"data_size": 1086544, "index_size": 1911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 12384, "raw_average_key_size": 21, "raw_value_size": 1076496, "raw_average_value_size": 1885, "num_data_blocks": 79, "num_entries": 571, "num_filter_entries": 571, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150776, "oldest_key_time": 1764150776, "file_creation_time": 1764150790, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 10450 microseconds, and 4243 cpu microseconds.
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.852085) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1090984 bytes OK
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.852109) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.853879) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.853902) EVENT_LOG_v1 {"time_micros": 1764150790853895, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.853927) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 1492352, prev total WAL file size 1492676, number of live WAL files 2.
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.854717) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353135' seq:72057594037927935, type:22 .. '6C6F676D0033373638' seq:0, type:0; will stop at (end)
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1065KB)], [15(14MB)]
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150790854829, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 16762835, "oldest_snapshot_seqno": -1}
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10723 keys, 16622059 bytes, temperature: kUnknown
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150790940342, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 16622059, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16559336, "index_size": 34846, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26821, "raw_key_size": 287649, "raw_average_key_size": 26, "raw_value_size": 16374746, "raw_average_value_size": 1527, "num_data_blocks": 1333, "num_entries": 10723, "num_filter_entries": 10723, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764150790, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.940684) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 16622059 bytes
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.942785) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.9 rd, 194.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 14.9 +0.0 blob) out(15.9 +0.0 blob), read-write-amplify(30.6) write-amplify(15.2) OK, records in: 11277, records dropped: 554 output_compression: NoCompression
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.942818) EVENT_LOG_v1 {"time_micros": 1764150790942803, "job": 6, "event": "compaction_finished", "compaction_time_micros": 85573, "compaction_time_cpu_micros": 50262, "output_level": 6, "num_output_files": 1, "total_output_size": 16622059, "num_input_records": 11277, "num_output_records": 10723, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150790943146, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150790946521, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.854560) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.946616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.946623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.946627) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.946630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:53:10 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:10.946633) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:53:11 localhost ceph-mon[297296]: Reconfiguring mds.mds.np0005536117.tfthzg (monmap changed)...
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:53:11 localhost ceph-mon[297296]: Reconfiguring daemon mds.mds.np0005536117.tfthzg on np0005536117.localdomain
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:53:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 26 04:53:12 localhost nova_compute[281415]: 2025-11-26 09:53:12.703 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:53:12 localhost ceph-mon[297296]: Removed label mon from host np0005536114.localdomain
Nov 26 04:53:12 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536117.ggibwg (monmap changed)...
Nov 26 04:53:12 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536117.ggibwg on np0005536117.localdomain
Nov 26 04:53:12 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:12 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:12 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:53:13 localhost podman[299526]:
Nov 26 04:53:13 localhost podman[299526]: 2025-11-26 09:53:13.896693786 +0000 UTC m=+0.091718502 container create 48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_ellis, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, release=553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, io.openshift.expose-services=, distribution-scope=public, version=7, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git)
Nov 26 04:53:13 localhost systemd[1]: Started libpod-conmon-48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851.scope.
Nov 26 04:53:13 localhost podman[299526]: 2025-11-26 09:53:13.854252861 +0000 UTC m=+0.049277597 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:53:13 localhost ceph-mon[297296]: Reconfiguring mon.np0005536117 (monmap changed)...
Nov 26 04:53:13 localhost ceph-mon[297296]: Reconfiguring daemon mon.np0005536117 on np0005536117.localdomain
Nov 26 04:53:13 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:13 localhost ceph-mon[297296]: Removed label mgr from host np0005536114.localdomain
Nov 26 04:53:13 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:13 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg'
Nov 26 04:53:13 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:53:13 localhost ceph-mon[297296]: Reconfiguring crash.np0005536118 (monmap changed)...
Nov 26 04:53:13 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 26 04:53:13 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536118 on np0005536118.localdomain
Nov 26 04:53:13 localhost systemd[1]: Started libcrun container.
Nov 26 04:53:13 localhost podman[299526]: 2025-11-26 09:53:13.994435522 +0000 UTC m=+0.189460228 container init 48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_ellis, io.buildah.version=1.33.12, vcs-type=git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, RELEASE=main, version=7)
Nov 26 04:53:14 localhost podman[299526]: 2025-11-26 09:53:14.019233034 +0000 UTC m=+0.214257750 container start 48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_ellis, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.expose-services=)
Nov 26 04:53:14 localhost podman[299526]: 2025-11-26 09:53:14.019520504 +0000 UTC m=+0.214545210 container attach 48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_ellis, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, version=7, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, release=553, io.openshift.tags=rhceph ceph)
Nov 26 04:53:14 localhost bold_ellis[299541]: 167 167
Nov 26 04:53:14 localhost systemd[1]: libpod-48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851.scope: Deactivated successfully.
Nov 26 04:53:14 localhost podman[299526]: 2025-11-26 09:53:14.026210659 +0000 UTC m=+0.221235405 container died 48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_ellis, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, release=553, name=rhceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:53:14 localhost ceph-mon[297296]: mon.np0005536118@2(peon).osd e85 _set_new_cache_sizes cache_size:1020048888 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:14 localhost podman[299546]: 2025-11-26 09:53:14.138270006 +0000 UTC m=+0.099406458 container remove 48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_ellis, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_BRANCH=main, name=rhceph, 
RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_CLEAN=True, distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:53:14 localhost systemd[1]: libpod-conmon-48a73a2e4d23517b31a080f23854e1dce96f7bccee5cdaf289212b4410e27851.scope: Deactivated successfully. Nov 26 04:53:14 localhost nova_compute[281415]: 2025-11-26 09:53:14.635 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:14 localhost systemd[1]: tmp-crun.UCgmUk.mount: Deactivated successfully. Nov 26 04:53:14 localhost systemd[1]: var-lib-containers-storage-overlay-436bb5107809dacc40971ab0958b67b27d8ae00d2559bd8f00002e2da927cd22-merged.mount: Deactivated successfully. 
Nov 26 04:53:14 localhost podman[299616]: Nov 26 04:53:14 localhost podman[299616]: 2025-11-26 09:53:14.921354863 +0000 UTC m=+0.079190567 container create ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_sanderson, release=553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, version=7, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7) Nov 26 04:53:14 localhost systemd[1]: Started libpod-conmon-ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f.scope. Nov 26 04:53:14 localhost systemd[1]: Started libcrun container. 
Nov 26 04:53:14 localhost podman[299616]: 2025-11-26 09:53:14.888492492 +0000 UTC m=+0.046328226 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:14 localhost podman[299616]: 2025-11-26 09:53:14.99311443 +0000 UTC m=+0.150950134 container init ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_sanderson, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=553, ceph=True, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 04:53:15 localhost podman[299616]: 2025-11-26 09:53:15.003316874 +0000 UTC m=+0.161152578 container start ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_sanderson, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., 
maintainer=Guillaume Abrioux , distribution-scope=public, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, release=553) Nov 26 04:53:15 localhost thirsty_sanderson[299630]: 167 167 Nov 26 04:53:15 localhost podman[299616]: 2025-11-26 09:53:15.003645914 +0000 UTC m=+0.161481678 container attach ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_sanderson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2025-09-24T08:57:55, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.33.12, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 26 04:53:15 localhost systemd[1]: libpod-ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f.scope: Deactivated successfully. Nov 26 04:53:15 localhost podman[299616]: 2025-11-26 09:53:15.01002828 +0000 UTC m=+0.167864014 container died ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_sanderson, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, io.buildah.version=1.33.12, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=) Nov 26 04:53:15 localhost podman[299635]: 2025-11-26 09:53:15.111162011 +0000 UTC m=+0.092672022 container remove ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_sanderson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vendor=Red Hat, Inc., architecture=x86_64, release=553, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git) Nov 26 04:53:15 localhost systemd[1]: libpod-conmon-ac97d585b90a5bf480579d24474ae2077b1821629e994f1f13dd942b318e3f7f.scope: Deactivated successfully. Nov 26 04:53:15 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:15 localhost ceph-mon[297296]: Removed label _admin from host np0005536114.localdomain Nov 26 04:53:15 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:15 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:15 localhost ceph-mon[297296]: Reconfiguring osd.0 (monmap changed)... 
Nov 26 04:53:15 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 26 04:53:15 localhost ceph-mon[297296]: Reconfiguring daemon osd.0 on np0005536118.localdomain Nov 26 04:53:15 localhost openstack_network_exporter[242153]: ERROR 09:53:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:53:15 localhost openstack_network_exporter[242153]: ERROR 09:53:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:53:15 localhost openstack_network_exporter[242153]: ERROR 09:53:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:53:15 localhost openstack_network_exporter[242153]: ERROR 09:53:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:53:15 localhost openstack_network_exporter[242153]: Nov 26 04:53:15 localhost openstack_network_exporter[242153]: ERROR 09:53:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:53:15 localhost openstack_network_exporter[242153]: Nov 26 04:53:15 localhost systemd[1]: var-lib-containers-storage-overlay-8f5d16b55df3f631d924c32ef410b0170e23b54ab6edc92fe6172c540df5e506-merged.mount: Deactivated successfully. 
Nov 26 04:53:16 localhost podman[299712]: Nov 26 04:53:16 localhost podman[299712]: 2025-11-26 09:53:16.034958845 +0000 UTC m=+0.086382198 container create 99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_swanson, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 04:53:16 localhost systemd[1]: Started libpod-conmon-99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f.scope. Nov 26 04:53:16 localhost podman[299712]: 2025-11-26 09:53:15.999725981 +0000 UTC m=+0.051149374 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:16 localhost systemd[1]: Started libcrun container. 
Nov 26 04:53:16 localhost podman[299712]: 2025-11-26 09:53:16.122270701 +0000 UTC m=+0.173694054 container init 99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_swanson, io.openshift.tags=rhceph ceph, name=rhceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, version=7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, release=553, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, distribution-scope=public, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:53:16 localhost podman[299712]: 2025-11-26 09:53:16.134976922 +0000 UTC m=+0.186400285 container start 99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_swanson, version=7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=553, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:53:16 localhost podman[299712]: 2025-11-26 09:53:16.135816908 +0000 UTC m=+0.187240281 container attach 99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_swanson, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=553, vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, ceph=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7) Nov 26 04:53:16 localhost awesome_swanson[299727]: 167 167 Nov 26 04:53:16 localhost systemd[1]: 
libpod-99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f.scope: Deactivated successfully. Nov 26 04:53:16 localhost podman[299712]: 2025-11-26 09:53:16.139524741 +0000 UTC m=+0.190948154 container died 99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_swanson, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, RELEASE=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7) Nov 26 04:53:16 localhost podman[299732]: 2025-11-26 09:53:16.250920348 +0000 UTC m=+0.095031174 container remove 99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_swanson, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, 
GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:53:16 localhost systemd[1]: libpod-conmon-99c6672607588aad6a7f7bee1b08c23480da8218a767d090df899853acf7901f.scope: Deactivated successfully. Nov 26 04:53:16 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:16 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:16 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:16 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:16 localhost ceph-mon[297296]: Reconfiguring osd.4 (monmap changed)... Nov 26 04:53:16 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 26 04:53:16 localhost ceph-mon[297296]: Reconfiguring daemon osd.4 on np0005536118.localdomain Nov 26 04:53:16 localhost systemd[1]: var-lib-containers-storage-overlay-3d5f389b08b04efd9d32ffd448d286854aefcdb011e8b30d6eb6ed49a7899bfe-merged.mount: Deactivated successfully. 
Nov 26 04:53:17 localhost podman[299808]: Nov 26 04:53:17 localhost podman[299808]: 2025-11-26 09:53:17.154963025 +0000 UTC m=+0.081802247 container create e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_aryabhata, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, name=rhceph, release=553, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-type=git, ceph=True) Nov 26 04:53:17 localhost systemd[1]: Started libpod-conmon-e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af.scope. Nov 26 04:53:17 localhost podman[299808]: 2025-11-26 09:53:17.121802055 +0000 UTC m=+0.048641307 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:17 localhost systemd[1]: Started libcrun container. 
Nov 26 04:53:17 localhost podman[299808]: 2025-11-26 09:53:17.244299613 +0000 UTC m=+0.171138835 container init e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_aryabhata, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, release=553, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 26 04:53:17 localhost podman[299808]: 2025-11-26 09:53:17.258304494 +0000 UTC m=+0.185143716 container start e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_aryabhata, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553) Nov 26 04:53:17 localhost podman[299808]: 2025-11-26 09:53:17.258701896 +0000 UTC m=+0.185541118 container attach e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_aryabhata, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., 
maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:53:17 localhost admiring_aryabhata[299823]: 167 167 Nov 26 04:53:17 localhost systemd[1]: libpod-e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af.scope: Deactivated successfully. Nov 26 04:53:17 localhost podman[299808]: 2025-11-26 09:53:17.262333617 +0000 UTC m=+0.189172869 container died e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_aryabhata, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a 
fully featured and supported base image.) Nov 26 04:53:17 localhost podman[299828]: 2025-11-26 09:53:17.368854434 +0000 UTC m=+0.093650291 container remove e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_aryabhata, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhceph, GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vcs-type=git, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:53:17 localhost systemd[1]: libpod-conmon-e1bb08516f362c666eafae90d53d7f69b9817b9174236bffe970705c920c20af.scope: Deactivated successfully. 
Nov 26 04:53:17 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:17 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:17 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:17 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:17 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:53:17 localhost ceph-mon[297296]: Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)... Nov 26 04:53:17 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:53:17 localhost ceph-mon[297296]: Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain Nov 26 04:53:17 localhost sshd[299880]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:53:17 localhost nova_compute[281415]: 2025-11-26 09:53:17.740 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:17 localhost systemd[1]: tmp-crun.TBvyoJ.mount: Deactivated successfully. Nov 26 04:53:17 localhost systemd[1]: var-lib-containers-storage-overlay-0a978929ffa49f3a3b97de609370786ac823e951424e3e1745100cf9d5ceb4a2-merged.mount: Deactivated successfully. 
Nov 26 04:53:18 localhost podman[299899]: Nov 26 04:53:18 localhost podman[299899]: 2025-11-26 09:53:18.125901959 +0000 UTC m=+0.085121759 container create 0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_gagarin, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, version=7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64) Nov 26 04:53:18 localhost systemd[1]: Started libpod-conmon-0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036.scope. Nov 26 04:53:18 localhost systemd[1]: Started libcrun container. 
Nov 26 04:53:18 localhost podman[299899]: 2025-11-26 09:53:18.091003746 +0000 UTC m=+0.050223546 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:18 localhost podman[299899]: 2025-11-26 09:53:18.197094069 +0000 UTC m=+0.156313899 container init 0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_gagarin, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, release=553, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph) Nov 26 04:53:18 localhost podman[299899]: 2025-11-26 09:53:18.206410596 +0000 UTC m=+0.165630396 container start 0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_gagarin, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) Nov 26 04:53:18 localhost podman[299899]: 2025-11-26 09:53:18.20686934 +0000 UTC m=+0.166089200 container attach 0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_gagarin, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 
04:53:18 localhost priceless_gagarin[299914]: 167 167 Nov 26 04:53:18 localhost systemd[1]: libpod-0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036.scope: Deactivated successfully. Nov 26 04:53:18 localhost podman[299899]: 2025-11-26 09:53:18.211322207 +0000 UTC m=+0.170542047 container died 0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_gagarin, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main) Nov 26 04:53:18 localhost podman[299919]: 2025-11-26 09:53:18.307540266 +0000 UTC m=+0.087922485 container remove 0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_gagarin, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, 
description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True) Nov 26 04:53:18 localhost systemd[1]: libpod-conmon-0a8bc8f55822ff92eb8b9b5057896598effdbc048f79643d2fc48520e9434036.scope: Deactivated successfully. Nov 26 04:53:18 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:18 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:18 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:53:18 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536118.anceyj (monmap changed)... 
Nov 26 04:53:18 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:53:18 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:53:18 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:18 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:18 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:53:18 localhost systemd[1]: var-lib-containers-storage-overlay-738d2b530a562e801453a7bac7b54e72ca8761fbe42324a15679a2341133833d-merged.mount: Deactivated successfully. Nov 26 04:53:19 localhost ceph-mon[297296]: mon.np0005536118@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054641 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:19 localhost podman[299987]: Nov 26 04:53:19 localhost podman[299987]: 2025-11-26 09:53:19.078961314 +0000 UTC m=+0.079099494 container create 85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_sanderson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, version=7, 
distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 04:53:19 localhost systemd[1]: Started libpod-conmon-85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f.scope. Nov 26 04:53:19 localhost systemd[1]: Started libcrun container. Nov 26 04:53:19 localhost podman[299987]: 2025-11-26 09:53:19.145807561 +0000 UTC m=+0.145945721 container init 85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_sanderson, ceph=True, distribution-scope=public, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, architecture=x86_64) Nov 26 04:53:19 localhost podman[299987]: 2025-11-26 09:53:19.047682172 +0000 UTC 
m=+0.047820392 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:19 localhost podman[299987]: 2025-11-26 09:53:19.154808237 +0000 UTC m=+0.154946417 container start 85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_sanderson, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main) Nov 26 04:53:19 localhost podman[299987]: 2025-11-26 09:53:19.155133967 +0000 UTC m=+0.155272117 container attach 85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_sanderson, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553) Nov 26 04:53:19 localhost unruffled_sanderson[300002]: 167 167 Nov 26 04:53:19 localhost systemd[1]: libpod-85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f.scope: Deactivated successfully. Nov 26 04:53:19 localhost podman[299987]: 2025-11-26 09:53:19.163604718 +0000 UTC m=+0.163742898 container died 85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_sanderson, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, 
maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:53:19 localhost podman[300007]: 2025-11-26 09:53:19.259436475 +0000 UTC m=+0.084295814 container remove 85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_sanderson, io.openshift.expose-services=, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_CLEAN=True, distribution-scope=public) Nov 26 04:53:19 localhost systemd[1]: libpod-conmon-85ff96dcb2cee21dc49e4f8b3edacd35950b59f6f705c4c5b0935bb17dbfa50f.scope: Deactivated successfully. Nov 26 04:53:19 localhost ceph-mon[297296]: Reconfiguring mon.np0005536118 (monmap changed)... 
Nov 26 04:53:19 localhost ceph-mon[297296]: Reconfiguring daemon mon.np0005536118 on np0005536118.localdomain Nov 26 04:53:19 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:19 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:19 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:53:19 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:53:19 localhost nova_compute[281415]: 2025-11-26 09:53:19.638 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:19 localhost systemd[1]: var-lib-containers-storage-overlay-cd0cefec012ce0b2da9b5ddd7dd4b07e84623de0780766651926ed1225619e16-merged.mount: Deactivated successfully. Nov 26 04:53:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:53:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:53:20 localhost podman[300023]: 2025-11-26 09:53:20.05349843 +0000 UTC m=+0.108872930 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:53:20 localhost podman[300023]: 2025-11-26 09:53:20.089484096 +0000 UTC m=+0.144858606 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:53:20 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:53:20 localhost podman[300040]: 2025-11-26 09:53:20.162268655 +0000 UTC m=+0.096071086 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 26 04:53:20 localhost podman[300040]: 2025-11-26 09:53:20.174824351 +0000 UTC m=+0.108626832 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:53:20 localhost systemd[1]: 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:53:20 localhost ceph-mon[297296]: Reconfiguring crash.np0005536119 (monmap changed)... Nov 26 04:53:20 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain Nov 26 04:53:20 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:20 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:20 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 26 04:53:21 localhost ceph-mon[297296]: Reconfiguring osd.1 (monmap changed)... Nov 26 04:53:21 localhost ceph-mon[297296]: Reconfiguring daemon osd.1 on np0005536119.localdomain Nov 26 04:53:21 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:21 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:21 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:21 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:21 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 26 04:53:22 localhost nova_compute[281415]: 2025-11-26 09:53:22.775 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:22 localhost ceph-mon[297296]: Reconfiguring osd.3 (monmap changed)... 
Nov 26 04:53:22 localhost ceph-mon[297296]: Reconfiguring daemon osd.3 on np0005536119.localdomain Nov 26 04:53:22 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:22 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:22 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:22 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:22 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:53:22 localhost ceph-mon[297296]: Reconfiguring mds.mds.np0005536119.dxhchp (monmap changed)... Nov 26 04:53:22 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:53:22 localhost ceph-mon[297296]: Reconfiguring daemon mds.mds.np0005536119.dxhchp on np0005536119.localdomain Nov 26 04:53:24 localhost ceph-mon[297296]: mon.np0005536118@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054730 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:24 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:24 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:24 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:53:24 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536119.eupicg (monmap changed)... 
Nov 26 04:53:24 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:53:24 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536119.eupicg on np0005536119.localdomain Nov 26 04:53:24 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:24 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:24 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:53:24 localhost nova_compute[281415]: 2025-11-26 09:53:24.674 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:53:24 localhost podman[300066]: 2025-11-26 09:53:24.835752954 +0000 UTC m=+0.092324411 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller) Nov 26 04:53:24 localhost podman[300066]: 2025-11-26 09:53:24.876319352 +0000 UTC m=+0.132890799 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251118) Nov 26 04:53:24 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:53:24 localhost podman[300067]: 2025-11-26 09:53:24.894060397 +0000 UTC m=+0.144223087 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal) Nov 26 04:53:24 localhost podman[300067]: 2025-11-26 09:53:24.913372191 +0000 UTC m=+0.163534871 container exec_died 
a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 26 04:53:24 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:53:25 localhost ceph-mon[297296]: Reconfiguring mon.np0005536119 (monmap changed)... Nov 26 04:53:25 localhost ceph-mon[297296]: Reconfiguring daemon mon.np0005536119 on np0005536119.localdomain Nov 26 04:53:25 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:25 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:26 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:26 localhost ceph-mon[297296]: Added label _no_schedule to host np0005536114.localdomain Nov 26 04:53:26 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:26 localhost ceph-mon[297296]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005536114.localdomain Nov 26 04:53:27 localhost podman[240049]: time="2025-11-26T09:53:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:53:27 localhost podman[240049]: @ - - [26/Nov/2025:09:53:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:53:27 localhost podman[240049]: 
@ - - [26/Nov/2025:09:53:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18714 "" "Go-http-client/1.1" Nov 26 04:53:27 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:27 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:27 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:53:27 localhost ceph-mon[297296]: Removing np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:53:27 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:53:27 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:53:27 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:53:27 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:27 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:27 localhost nova_compute[281415]: 2025-11-26 09:53:27.814 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:53:28 localhost systemd[1]: tmp-crun.Illt76.mount: Deactivated successfully. 
Nov 26 04:53:28 localhost podman[300395]: 2025-11-26 09:53:28.214956613 +0000 UTC m=+0.097996745 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:53:28 localhost podman[300395]: 2025-11-26 09:53:28.249354461 +0000 UTC m=+0.132394593 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:53:28 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:53:28 localhost ceph-mon[297296]: Removing np0005536114.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:53:28 localhost ceph-mon[297296]: Removing np0005536114.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:53:28 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:53:28 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:53:28 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain"} : dispatch Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain"} : dispatch Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd='[{"prefix":"config-key 
del","key":"mgr/cephadm/host.np0005536114.localdomain"}]': finished Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth rm", "entity": "client.crash.np0005536114.localdomain"} : dispatch Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth rm", "entity": "client.crash.np0005536114.localdomain"} : dispatch Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005536114.localdomain"}]': finished Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:28 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:29 localhost ceph-mon[297296]: mon.np0005536118@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:29 localhost nova_compute[281415]: 2025-11-26 09:53:29.715 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:29 localhost ceph-mon[297296]: Removing daemon crash.np0005536114 from np0005536114.localdomain -- ports [] Nov 26 04:53:29 localhost ceph-mon[297296]: Removing key for client.crash.np0005536114.localdomain Nov 26 04:53:29 localhost ceph-mon[297296]: Removed host np0005536114.localdomain Nov 26 04:53:29 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:53:29 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:29 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", 
"caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:53:29 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0. Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.872467) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19 Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150810872542, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1204, "num_deletes": 254, "total_data_size": 1946738, "memory_usage": 1970560, "flush_reason": "Manual Compaction"} Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150810882520, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 1056586, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12300, "largest_seqno": 13499, "table_properties": {"data_size": 1051312, "index_size": 2553, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13104, "raw_average_key_size": 20, "raw_value_size": 1039897, "raw_average_value_size": 1622, 
"num_data_blocks": 109, "num_entries": 641, "num_filter_entries": 641, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150790, "oldest_key_time": 1764150790, "file_creation_time": 1764150810, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}} Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 10127 microseconds, and 5781 cpu microseconds. Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.882593) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 1056586 bytes OK Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.882621) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.884398) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.884419) EVENT_LOG_v1 {"time_micros": 1764150810884413, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.884446) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1940432, prev total WAL file size 1944958, number of live WAL files 2. Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.885183) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323635' seq:72057594037927935, type:22 .. 
'6B760031353139' seq:0, type:0; will stop at (end) Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(1031KB)], [18(15MB)] Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150810885243, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17678645, "oldest_snapshot_seqno": -1} Nov 26 04:53:30 localhost ceph-mon[297296]: Reconfiguring crash.np0005536117 (monmap changed)... Nov 26 04:53:30 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536117 on np0005536117.localdomain Nov 26 04:53:30 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:30 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:30 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:53:30 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10839 keys, 16662986 bytes, temperature: kUnknown Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150810962515, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 16662986, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16599867, "index_size": 34922, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, 
"index_value_is_delta_encoded": 1, "filter_size": 27141, "raw_key_size": 292123, "raw_average_key_size": 26, "raw_value_size": 16413562, "raw_average_value_size": 1514, "num_data_blocks": 1322, "num_entries": 10839, "num_filter_entries": 10839, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764150810, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.963274) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 16662986 bytes Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.965067) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 227.2 rd, 214.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 15.9 +0.0 blob) out(15.9 +0.0 blob), read-write-amplify(32.5) write-amplify(15.8) OK, records in: 11364, records dropped: 525 output_compression: NoCompression Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.965103) EVENT_LOG_v1 {"time_micros": 1764150810965089, "job": 8, "event": "compaction_finished", "compaction_time_micros": 77825, "compaction_time_cpu_micros": 50640, "output_level": 6, "num_output_files": 1, "total_output_size": 16662986, "num_input_records": 11364, "num_output_records": 10839, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150810965665, "job": 8, "event": "table_file_deletion", "file_number": 20} Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150810968311, 
"job": 8, "event": "table_file_deletion", "file_number": 18} Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.885079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.968457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.968499) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.968502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.968510) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:53:30 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:53:30.968513) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:53:32 localhost nova_compute[281415]: 2025-11-26 09:53:32.818 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:34 localhost ceph-mon[297296]: mon.np0005536118@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:34 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:34 localhost nova_compute[281415]: 2025-11-26 09:53:34.764 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:53:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:53:35 localhost podman[300512]: 2025-11-26 09:53:35.836161858 +0000 UTC m=+0.088533283 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 26 04:53:35 localhost systemd[1]: tmp-crun.a3PBAK.mount: Deactivated successfully. Nov 26 04:53:35 localhost podman[300511]: 2025-11-26 09:53:35.889503319 +0000 UTC m=+0.145735914 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:53:35 localhost podman[300512]: 2025-11-26 09:53:35.905907314 +0000 UTC m=+0.158278739 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd) Nov 26 04:53:35 localhost systemd[1]: 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:53:35 localhost podman[300511]: 2025-11-26 09:53:35.921123202 +0000 UTC m=+0.177355847 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 26 
04:53:35 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:53:37 localhost sshd[300566]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:53:37 localhost ceph-mon[297296]: Saving service mon spec with placement label:mon Nov 26 04:53:37 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:37 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:53:37 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:37 localhost nova_compute[281415]: 2025-11-26 09:53:37.854 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:38 localhost ceph-mon[297296]: from='mgr.26711 ' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:39 localhost ceph-mon[297296]: mon.np0005536118@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:39 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcc98f8b00 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Nov 26 04:53:39 localhost ceph-mon[297296]: mon.np0005536118@2(peon) e14 my rank is now 1 (was 2) Nov 26 04:53:39 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Nov 26 04:53:39 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Nov 26 04:53:39 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcc98f8c60 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Nov 26 04:53:39 localhost ceph-mon[297296]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:53:39 localhost ceph-mon[297296]: paxos.1).electionLogic(50) init, last seen epoch 50 Nov 26 04:53:39 localhost 
ceph-mon[297296]: mon.np0005536118@1(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:53:39 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:53:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:53:39 localhost ceph-mon[297296]: Remove daemons mon.np0005536119 Nov 26 04:53:39 localhost ceph-mon[297296]: Safe to remove mon.np0005536119: new quorum should be ['np0005536117', 'np0005536118'] (from ['np0005536117', 'np0005536118']) Nov 26 04:53:39 localhost ceph-mon[297296]: Removing monitor np0005536119 from monmap... Nov 26 04:53:39 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "mon rm", "name": "np0005536119"} : dispatch Nov 26 04:53:39 localhost ceph-mon[297296]: Removing daemon mon.np0005536119 from np0005536119.localdomain -- ports [] Nov 26 04:53:39 localhost ceph-mon[297296]: mon.np0005536118 calling monitor election Nov 26 04:53:39 localhost ceph-mon[297296]: mon.np0005536117 calling monitor election Nov 26 04:53:39 localhost ceph-mon[297296]: mon.np0005536117 is new leader, mons np0005536117,np0005536118 in quorum (ranks 0,1) Nov 26 04:53:39 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:53:39 localhost ceph-mon[297296]: overall HEALTH_OK Nov 26 04:53:39 localhost nova_compute[281415]: 2025-11-26 09:53:39.802 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:40 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:53:40 localhost 
ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:53:40 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:53:41 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:53:41 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:53:41 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:53:41 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:41 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:41 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:41 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:41 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:41 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:41 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:41 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:53:42 localhost ceph-mon[297296]: Reconfiguring crash.np0005536117 (monmap changed)... 
Nov 26 04:53:42 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536117 on np0005536117.localdomain Nov 26 04:53:42 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:42 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:42 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 26 04:53:42 localhost nova_compute[281415]: 2025-11-26 09:53:42.905 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:43 localhost ceph-mon[297296]: Reconfiguring osd.2 (monmap changed)... Nov 26 04:53:43 localhost ceph-mon[297296]: Reconfiguring daemon osd.2 on np0005536117.localdomain Nov 26 04:53:43 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:43 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:43 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 26 04:53:43 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:44 localhost ceph-mon[297296]: Reconfiguring osd.5 (monmap changed)... 
Nov 26 04:53:44 localhost ceph-mon[297296]: Reconfiguring daemon osd.5 on np0005536117.localdomain Nov 26 04:53:44 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:44 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:44 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:53:44 localhost nova_compute[281415]: 2025-11-26 09:53:44.848 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:45 localhost ceph-mon[297296]: Reconfiguring mds.mds.np0005536117.tfthzg (monmap changed)... Nov 26 04:53:45 localhost ceph-mon[297296]: Reconfiguring daemon mds.mds.np0005536117.tfthzg on np0005536117.localdomain Nov 26 04:53:45 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:45 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:45 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:53:45 localhost openstack_network_exporter[242153]: ERROR 09:53:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:53:45 localhost openstack_network_exporter[242153]: ERROR 09:53:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:53:45 localhost openstack_network_exporter[242153]: 
ERROR 09:53:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:53:45 localhost openstack_network_exporter[242153]: ERROR 09:53:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:53:45 localhost openstack_network_exporter[242153]: Nov 26 04:53:45 localhost openstack_network_exporter[242153]: ERROR 09:53:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:53:45 localhost openstack_network_exporter[242153]: Nov 26 04:53:46 localhost podman[300958]: Nov 26 04:53:46 localhost podman[300958]: 2025-11-26 09:53:46.591792545 +0000 UTC m=+0.083451048 container create 92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_pare, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_CLEAN=True) Nov 26 04:53:46 localhost systemd[1]: Started libpod-conmon-92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87.scope. 
Nov 26 04:53:46 localhost systemd[1]: Started libcrun container. Nov 26 04:53:46 localhost podman[300958]: 2025-11-26 09:53:46.55811701 +0000 UTC m=+0.049775533 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:46 localhost podman[300958]: 2025-11-26 09:53:46.670047302 +0000 UTC m=+0.161705795 container init 92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_pare, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , architecture=x86_64, ceph=True, name=rhceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, RELEASE=main) Nov 26 04:53:46 localhost podman[300958]: 2025-11-26 09:53:46.681838085 +0000 UTC m=+0.173496578 container start 92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_pare, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, 
vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:53:46 localhost podman[300958]: 2025-11-26 09:53:46.682157535 +0000 UTC m=+0.173816038 container attach 92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_pare, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , version=7, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
io.openshift.expose-services=) Nov 26 04:53:46 localhost busy_pare[300973]: 167 167 Nov 26 04:53:46 localhost systemd[1]: libpod-92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87.scope: Deactivated successfully. Nov 26 04:53:46 localhost podman[300958]: 2025-11-26 09:53:46.685338812 +0000 UTC m=+0.176997375 container died 92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_pare, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.expose-services=, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 26 04:53:46 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536117.ggibwg (monmap changed)... 
Nov 26 04:53:46 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536117.ggibwg on np0005536117.localdomain Nov 26 04:53:46 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:46 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:46 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:53:46 localhost podman[300978]: 2025-11-26 09:53:46.792309092 +0000 UTC m=+0.094570079 container remove 92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_pare, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vendor=Red Hat, Inc.) 
Nov 26 04:53:46 localhost systemd[1]: libpod-conmon-92572874bd8ea8542d85a437cb5a01e42858969b5d4d007daaad84e1fe9d9a87.scope: Deactivated successfully. Nov 26 04:53:47 localhost podman[301047]: Nov 26 04:53:47 localhost podman[301047]: 2025-11-26 09:53:47.518490499 +0000 UTC m=+0.083214251 container create f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph) Nov 26 04:53:47 localhost systemd[1]: Started libpod-conmon-f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac.scope. Nov 26 04:53:47 localhost podman[301047]: 2025-11-26 09:53:47.484364739 +0000 UTC m=+0.049088481 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:47 localhost systemd[1]: Started libcrun container. Nov 26 04:53:47 localhost systemd[1]: var-lib-containers-storage-overlay-0a74fe0b241c85101487d90008296ff05ae4d12c854fcf690bf053444eed2aae-merged.mount: Deactivated successfully. 
Nov 26 04:53:47 localhost podman[301047]: 2025-11-26 09:53:47.600713598 +0000 UTC m=+0.165437330 container init f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=553, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True) Nov 26 04:53:47 localhost mystifying_poitras[301061]: 167 167 Nov 26 04:53:47 localhost podman[301047]: 2025-11-26 09:53:47.61055278 +0000 UTC m=+0.175276512 container start f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, architecture=x86_64) Nov 26 04:53:47 localhost podman[301047]: 2025-11-26 09:53:47.615269976 +0000 UTC m=+0.179993738 container attach f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, release=553, maintainer=Guillaume Abrioux , io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 26 04:53:47 localhost systemd[1]: 
libpod-f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac.scope: Deactivated successfully. Nov 26 04:53:47 localhost podman[301047]: 2025-11-26 09:53:47.618283608 +0000 UTC m=+0.183007340 container died f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, release=553, CEPH_POINT_RELEASE=, version=7) Nov 26 04:53:47 localhost systemd[1]: var-lib-containers-storage-overlay-bc7aa3fbca64f5d531036186a386c902770259a0127e67646b6dd8803d39d54b-merged.mount: Deactivated successfully. Nov 26 04:53:47 localhost ceph-mon[297296]: Reconfiguring crash.np0005536118 (monmap changed)... 
Nov 26 04:53:47 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536118 on np0005536118.localdomain Nov 26 04:53:47 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:47 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:47 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 26 04:53:47 localhost podman[301066]: 2025-11-26 09:53:47.729138278 +0000 UTC m=+0.100021427 container remove f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, name=rhceph, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.buildah.version=1.33.12, release=553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=) Nov 26 04:53:47 localhost systemd[1]: libpod-conmon-f879341990f0ccd1c56a56efb88219f5753898c627796e8abe1c1afe72d2edac.scope: Deactivated successfully. 
Nov 26 04:53:47 localhost nova_compute[281415]: 2025-11-26 09:53:47.932 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:48 localhost podman[301141]: Nov 26 04:53:48 localhost podman[301141]: 2025-11-26 09:53:48.571836558 +0000 UTC m=+0.082670065 container create 99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_dubinsky, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:53:48 localhost systemd[1]: Started libpod-conmon-99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e.scope. Nov 26 04:53:48 localhost systemd[1]: Started libcrun container. 
Nov 26 04:53:48 localhost podman[301141]: 2025-11-26 09:53:48.538450722 +0000 UTC m=+0.049284208 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:48 localhost podman[301141]: 2025-11-26 09:53:48.64178084 +0000 UTC m=+0.152614316 container init 99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_dubinsky, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Nov 26 04:53:48 localhost infallible_dubinsky[301156]: 167 167 Nov 26 04:53:48 localhost systemd[1]: libpod-99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e.scope: Deactivated successfully. 
Nov 26 04:53:48 localhost podman[301141]: 2025-11-26 09:53:48.655449189 +0000 UTC m=+0.166282675 container start 99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_dubinsky, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553) Nov 26 04:53:48 localhost podman[301141]: 2025-11-26 09:53:48.655789561 +0000 UTC m=+0.166623057 container attach 99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_dubinsky, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-type=git, release=553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:53:48 localhost podman[301141]: 2025-11-26 09:53:48.658045689 +0000 UTC m=+0.168879185 container died 99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_dubinsky, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, CEPH_POINT_RELEASE=) Nov 26 04:53:48 localhost ceph-mon[297296]: Reconfiguring osd.0 (monmap changed)... 
Nov 26 04:53:48 localhost ceph-mon[297296]: Reconfiguring daemon osd.0 on np0005536118.localdomain Nov 26 04:53:48 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:48 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:48 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 26 04:53:48 localhost podman[301161]: 2025-11-26 09:53:48.765168895 +0000 UTC m=+0.099324037 container remove 99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_dubinsky, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, version=7, io.openshift.tags=rhceph ceph, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=) Nov 26 04:53:48 localhost systemd[1]: libpod-conmon-99f85b1ffe5a9ec7d171b8a36fdacc56284f09b8b0e381dcc49f2195609f241e.scope: Deactivated successfully. 
Nov 26 04:53:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:49 localhost podman[301238]: Nov 26 04:53:49 localhost podman[301238]: 2025-11-26 09:53:49.583760753 +0000 UTC m=+0.082294452 container create db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_sinoussi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, com.redhat.component=rhceph-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.buildah.version=1.33.12) Nov 26 04:53:49 localhost systemd[1]: tmp-crun.MtLvj4.mount: Deactivated successfully. Nov 26 04:53:49 localhost systemd[1]: var-lib-containers-storage-overlay-741f3ce760d4702d4a19a5a70efda914d78064539610491ebdef63a0c94728f2-merged.mount: Deactivated successfully. Nov 26 04:53:49 localhost systemd[1]: Started libpod-conmon-db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c.scope. Nov 26 04:53:49 localhost systemd[1]: Started libcrun container. 
Nov 26 04:53:49 localhost podman[301238]: 2025-11-26 09:53:49.549670645 +0000 UTC m=+0.048204394 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:49 localhost podman[301238]: 2025-11-26 09:53:49.658719189 +0000 UTC m=+0.157252888 container init db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_sinoussi, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , release=553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, ceph=True, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:53:49 localhost podman[301238]: 2025-11-26 09:53:49.670190652 +0000 UTC m=+0.168724351 container start db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_sinoussi, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, distribution-scope=public, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True) Nov 26 04:53:49 localhost podman[301238]: 2025-11-26 09:53:49.67046278 +0000 UTC m=+0.168996479 container attach db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_sinoussi, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, release=553, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:53:49 localhost 
pensive_sinoussi[301253]: 167 167 Nov 26 04:53:49 localhost systemd[1]: libpod-db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c.scope: Deactivated successfully. Nov 26 04:53:49 localhost podman[301238]: 2025-11-26 09:53:49.6743629 +0000 UTC m=+0.172896609 container died db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_sinoussi, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:53:49 localhost ceph-mon[297296]: Reconfiguring osd.4 (monmap changed)... 
Nov 26 04:53:49 localhost ceph-mon[297296]: Reconfiguring daemon osd.4 on np0005536118.localdomain Nov 26 04:53:49 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:49 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:49 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:53:49 localhost podman[301258]: 2025-11-26 09:53:49.781471494 +0000 UTC m=+0.094991743 container remove db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_sinoussi, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=553, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-09-24T08:57:55, distribution-scope=public) Nov 26 04:53:49 localhost systemd[1]: 
libpod-conmon-db5b27be0cafadcebec3e756ac68d329ebd51e414aa9572b0c53f800c5ebf73c.scope: Deactivated successfully. Nov 26 04:53:49 localhost nova_compute[281415]: 2025-11-26 09:53:49.852 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:53:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:53:50 localhost podman[301326]: 2025-11-26 09:53:50.551589232 +0000 UTC m=+0.107907830 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm) Nov 26 04:53:50 localhost podman[301326]: 2025-11-26 09:53:50.592537712 +0000 UTC m=+0.148856320 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Nov 26 04:53:50 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:53:50 localhost systemd[1]: var-lib-containers-storage-overlay-07b6e1bd6f951895f5e01bad4ad7466a7c2ab5e26b8c3b56d7171331531d6f70-merged.mount: Deactivated successfully. Nov 26 04:53:50 localhost podman[301325]: 2025-11-26 09:53:50.610539095 +0000 UTC m=+0.166293386 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:53:50 localhost podman[301349]: Nov 26 04:53:50 localhost podman[301325]: 2025-11-26 09:53:50.648510913 +0000 UTC m=+0.204265174 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:53:50 localhost podman[301349]: 2025-11-26 09:53:50.651057132 +0000 UTC m=+0.142623248 container create 13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_fermat, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, 
GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 04:53:50 localhost systemd[1]: Started libpod-conmon-13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1.scope. Nov 26 04:53:50 localhost podman[301349]: 2025-11-26 09:53:50.616737666 +0000 UTC m=+0.108303862 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:53:50 localhost systemd[1]: Started libcrun container. Nov 26 04:53:50 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:53:50 localhost podman[301349]: 2025-11-26 09:53:50.736109597 +0000 UTC m=+0.227675713 container init 13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_fermat, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True, version=7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55) Nov 26 04:53:50 localhost podman[301349]: 2025-11-26 
09:53:50.747875599 +0000 UTC m=+0.239441715 container start 13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_fermat, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, version=7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True) Nov 26 04:53:50 localhost podman[301349]: 2025-11-26 09:53:50.748287232 +0000 UTC m=+0.239853348 container attach 13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_fermat, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in 
a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_BRANCH=main, build-date=2025-09-24T08:57:55) Nov 26 04:53:50 localhost priceless_fermat[301382]: 167 167 Nov 26 04:53:50 localhost systemd[1]: libpod-13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1.scope: Deactivated successfully. Nov 26 04:53:50 localhost podman[301349]: 2025-11-26 09:53:50.752185342 +0000 UTC m=+0.243751488 container died 13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_fermat, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, ceph=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:53:50 localhost 
ceph-mon[297296]: Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)... Nov 26 04:53:50 localhost ceph-mon[297296]: Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain Nov 26 04:53:50 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:50 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:50 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:53:50 localhost podman[301388]: 2025-11-26 09:53:50.862176225 +0000 UTC m=+0.098342306 container remove 13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_fermat, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, RELEASE=main, version=7) Nov 26 
04:53:50 localhost systemd[1]: libpod-conmon-13962a0a071887004d0b6baa3f45af4c4351635e8df3b19ac3f520f201785df1.scope: Deactivated successfully. Nov 26 04:53:51 localhost systemd[1]: var-lib-containers-storage-overlay-94fd2995e3d22b707d0a65f86190931d172e8e6691f5f0fac8bb0baa3a1d86a1-merged.mount: Deactivated successfully. Nov 26 04:53:51 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536118.anceyj (monmap changed)... Nov 26 04:53:51 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:53:51 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:51 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:51 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:53:52 localhost ceph-mon[297296]: Reconfiguring crash.np0005536119 (monmap changed)... 
Nov 26 04:53:52 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain Nov 26 04:53:52 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:52 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:52 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 26 04:53:52 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:52 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:53:52 localhost nova_compute[281415]: 2025-11-26 09:53:52.984 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:53 localhost ceph-mon[297296]: Reconfiguring osd.1 (monmap changed)... 
Nov 26 04:53:53 localhost ceph-mon[297296]: Reconfiguring daemon osd.1 on np0005536119.localdomain Nov 26 04:53:53 localhost ceph-mon[297296]: Deploying daemon mon.np0005536119 on np0005536119.localdomain Nov 26 04:53:53 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:53 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:53:53 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 26 04:53:53 localhost nova_compute[281415]: 2025-11-26 09:53:53.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:53 localhost nova_compute[281415]: 2025-11-26 09:53:53.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:53 localhost nova_compute[281415]: 2025-11-26 09:53:53.883 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:53 localhost nova_compute[281415]: 2025-11-26 09:53:53.883 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:54 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:53:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:53:54 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3140303944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:53:54 localhost ceph-mon[297296]: Reconfiguring osd.3 (monmap changed)... Nov 26 04:53:54 localhost ceph-mon[297296]: Reconfiguring daemon osd.3 on np0005536119.localdomain Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.882 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.882 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.883 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.901 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:54 localhost nova_compute[281415]: 2025-11-26 09:53:54.905 281419 DEBUG oslo_service.periodic_task [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 26 04:53:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 26 04:53:55 localhost sshd[301405]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:53:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 26 04:53:55 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x55fcc98f8000 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Nov 26 04:53:55 localhost ceph-mon[297296]: log_channel(cluster) log [INF] : mon.np0005536118 calling monitor election Nov 26 04:53:55 localhost ceph-mon[297296]: paxos.1).electionLogic(52) init, last seen epoch 52 Nov 26 04:53:55 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:53:55 localhost podman[301407]: 2025-11-26 09:53:55.840557642 +0000 UTC m=+0.087956296 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.schema-version=1.0) Nov 26 04:53:55 localhost podman[301407]: 2025-11-26 09:53:55.885281938 +0000 UTC m=+0.132680602 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 04:53:55 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:53:55 localhost nova_compute[281415]: 2025-11-26 09:53:55.922 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:55 localhost podman[301408]: 2025-11-26 09:53:55.890736336 +0000 UTC m=+0.139452371 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal) Nov 26 04:53:55 localhost podman[301408]: 2025-11-26 09:53:55.974388929 +0000 UTC m=+0.223104934 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, architecture=x86_64) Nov 26 04:53:55 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:53:56 localhost nova_compute[281415]: 2025-11-26 09:53:56.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:53:56 localhost nova_compute[281415]: 2025-11-26 09:53:56.871 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:53:56 localhost nova_compute[281415]: 2025-11-26 09:53:56.872 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:53:56 localhost nova_compute[281415]: 2025-11-26 09:53:56.872 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:53:56 localhost nova_compute[281415]: 2025-11-26 09:53:56.872 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:53:56 localhost nova_compute[281415]: 2025-11-26 09:53:56.873 281419 DEBUG oslo_concurrency.processutils [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:53:57 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 handle_auth_request failed to assign global_id Nov 26 04:53:57 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 handle_auth_request failed to assign global_id Nov 26 04:53:57 localhost podman[240049]: time="2025-11-26T09:53:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:53:57 localhost podman[240049]: @ - - [26/Nov/2025:09:53:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:53:57 localhost podman[240049]: @ - - [26/Nov/2025:09:53:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18714 "" "Go-http-client/1.1" Nov 26 04:53:57 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 handle_auth_request failed to assign global_id Nov 26 04:53:58 localhost nova_compute[281415]: 2025-11-26 09:53:58.017 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:53:58 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 handle_auth_request failed to assign global_id Nov 26 04:53:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:53:58 localhost podman[301463]: 2025-11-26 09:53:58.826802304 +0000 UTC m=+0.085917433 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:53:58 localhost podman[301463]: 2025-11-26 09:53:58.839569347 +0000 UTC m=+0.098684516 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:53:58 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:53:59 localhost ceph-mds[286153]: mds.beacon.mds.np0005536118.kohnma missed beacon ack from the monitors Nov 26 04:53:59 localhost nova_compute[281415]: 2025-11-26 09:53:59.930 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 handle_auth_request failed to assign global_id Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 handle_auth_request failed to assign global_id Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 handle_auth_request failed to assign global_id Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536118@1(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536118 calling monitor election Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536117 calling monitor election Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536119 calling monitor election Nov 26 04:54:00 localhost ceph-mon[297296]: mon.np0005536117 is new leader, mons np0005536117,np0005536118,np0005536119 in quorum (ranks 0,1,2) Nov 26 04:54:00 localhost ceph-mon[297296]: overall HEALTH_OK Nov 26 04:54:00 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:00 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:00 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536119.dxhchp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", 
"mds", "allow"]} : dispatch Nov 26 04:54:01 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:54:01 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2432723965' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:54:01 localhost nova_compute[281415]: 2025-11-26 09:54:01.776 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.903s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:54:01 localhost ceph-mon[297296]: Reconfiguring mds.mds.np0005536119.dxhchp (monmap changed)... Nov 26 04:54:01 localhost ceph-mon[297296]: Reconfiguring daemon mds.mds.np0005536119.dxhchp on np0005536119.localdomain Nov 26 04:54:01 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:01 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:01 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536119.eupicg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:54:01 localhost nova_compute[281415]: 2025-11-26 09:54:01.865 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:54:01 localhost nova_compute[281415]: 2025-11-26 09:54:01.865 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for 
instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.098 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.100 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11798MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", 
"address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.100 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.101 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.260 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.261 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.262 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.344 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.420 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 
04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.421 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.436 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.458 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_F16C,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. 
Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.508223) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150842508297, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1185, "num_deletes": 251, "total_data_size": 1936522, "memory_usage": 1959616, "flush_reason": "Manual Compaction"} Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Nov 26 04:54:02 localhost nova_compute[281415]: 2025-11-26 09:54:02.510 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150842518790, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1115062, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13504, "largest_seqno": 14684, "table_properties": {"data_size": 1109663, "index_size": 2678, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 14476, "raw_average_key_size": 22, "raw_value_size": 1097886, "raw_average_value_size": 1681, "num_data_blocks": 115, "num_entries": 653, "num_filter_entries": 653, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 
0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150810, "oldest_key_time": 1764150810, "file_creation_time": 1764150842, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 10621 microseconds, and 4491 cpu microseconds. Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.518845) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1115062 bytes OK Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.518874) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.520445) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.520469) EVENT_LOG_v1 {"time_micros": 1764150842520461, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.520499) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1930328, prev total WAL file size 1930328, number of live WAL files 2. Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.521475) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. 
'7061786F73003131303435' seq:0, type:0; will stop at (end) Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1088KB)], [21(15MB)] Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150842521584, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 17778048, "oldest_snapshot_seqno": -1} Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10962 keys, 14806518 bytes, temperature: kUnknown Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150842590127, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14806518, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14744118, "index_size": 33883, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27461, "raw_key_size": 296013, "raw_average_key_size": 27, "raw_value_size": 14557167, "raw_average_value_size": 1327, "num_data_blocks": 1275, "num_entries": 10962, "num_filter_entries": 10962, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764150842, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.591111) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14806518 bytes Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.592811) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 256.9 rd, 214.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 15.9 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(29.2) write-amplify(13.3) OK, records in: 11492, records dropped: 530 output_compression: NoCompression Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.592848) EVENT_LOG_v1 {"time_micros": 1764150842592831, "job": 10, "event": "compaction_finished", "compaction_time_micros": 69205, "compaction_time_cpu_micros": 39425, "output_level": 6, "num_output_files": 1, "total_output_size": 14806518, "num_input_records": 11492, "num_output_records": 10962, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005536118/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150842593238, "job": 10, "event": "table_file_deletion", "file_number": 23} Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150842596466, "job": 10, "event": "table_file_deletion", "file_number": 21} Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.521358) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.596622) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.596633) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.596641) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.596644) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:54:02 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:54:02.596647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:54:02 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536119.eupicg (monmap changed)... 
Nov 26 04:54:02 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536119.eupicg on np0005536119.localdomain Nov 26 04:54:02 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:02 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:54:03 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1062456648' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:54:03 localhost nova_compute[281415]: 2025-11-26 09:54:03.058 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:03 localhost nova_compute[281415]: 2025-11-26 09:54:03.073 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.563s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:54:03 localhost nova_compute[281415]: 2025-11-26 09:54:03.081 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:54:03 localhost nova_compute[281415]: 2025-11-26 09:54:03.096 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 
'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:54:03 localhost nova_compute[281415]: 2025-11-26 09:54:03.098 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:54:03 localhost nova_compute[281415]: 2025-11-26 09:54:03.098 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.997s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.582 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.583 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.596 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.597 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f6c969a-7949-4ebe-8625-a2ec2302265d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.584039', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7220f3c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.826294321, 'message_signature': '41461471c6a87d45505c3c313323cc681016f2047675f1194d7dc50068ff92b8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:54:03.584039', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd72223d2-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.826294321, 'message_signature': '1c68cc5876a2bf25805bf6f4c93ed3e32dc2bcd55d3db664877c581931f58f4a'}]}, 'timestamp': '2025-11-26 09:54:03.597969', '_unique_id': '73448f3bb7ed4e14b48501dcf30d35c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.599 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.629 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.630 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '02fa9a92-faf1-4b45-8eb5-2f3a8192f713', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.601632', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7271496-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': '44938900ac8e8e1f142740e5bf3e93441fd52ff4a87fc35ffc6a988bc6103693'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:54:03.601632', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd727256c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': 'b0dc6db824762cdd256fe808613c1b722ad000ebfb5ddbc4541df6a39fc221a0'}]}, 'timestamp': '2025-11-26 09:54:03.630729', '_unique_id': '7fa6db340ee14c61a3db50a7b07cab90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.631 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.633 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.633 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7297559f-9295-4c47-a9c4-c93aa0b79ec5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.633007', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7278f48-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': 'a60b14ea9494e2e2e08d1517f8a347777342c6cbcfa7e3ac343d3feae32b874e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 
'timestamp': '2025-11-26T09:54:03.633007', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd7279f6a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': '8f1a2059535266ab2d0624bff535354f43ae1c3cb4411868266d43af81645a3e'}]}, 'timestamp': '2025-11-26 09:54:03.633851', '_unique_id': '3f35cf377e354e1a9326ee0736bb47ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.634 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.636 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.636 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b16f562-e636-404a-8b8f-b0bb1836d7c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.636059', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd728063a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.826294321, 'message_signature': '79067f0aae8775ed8c39e6d361e7c7dfeb2972c21b6c655efc866e2892a93d64'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 
'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:54:03.636059', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd7281648-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.826294321, 'message_signature': '943df789e03383344233146a552d092136545ef7828f023c23e4ad6ca79d0ff0'}]}, 'timestamp': '2025-11-26 09:54:03.636892', '_unique_id': '64c697799f434335a27d56ff53520f30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.638 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.639 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.657 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 14220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:54:03.660 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock 
"_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404 Nov 26 04:54:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:54:03.660 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409 Nov 26 04:54:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:54:03.661 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '4a721dcd-3eeb-4426-be76-cbeb604a08f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14220000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:54:03.639558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd72b5b32-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.899985408, 'message_signature': 'a9b3d44984e7f34b22b98d72e343c136e58bf7c89a770f4206422af01fca3b67'}]}, 'timestamp': '2025-11-26 09:54:03.658338', '_unique_id': 'd3a5dbb747a74bbd834b4b073c731fba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:54:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '525f295c-48ac-47b5-b4d1-5fb1500e8537', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.660466', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd72bbf64-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': 'd51f5b9a6aa66cb6fdc28fd7a58761c9fc57d996d872ace629d1114e6cc14dbb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:54:03.660466', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd72bd03a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': '37230161b39d282a2337b354fae1816bc12669dfa41de3af458f06b34c18f4ed'}]}, 'timestamp': '2025-11-26 09:54:03.661414', '_unique_id': '5c8bd536d5af47cbb9c69f519d2f66a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.663 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.bytes in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b28f5b43-836d-41fc-82ef-8234d80abd6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7111, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.663564', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 
'd72d6c1a-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': '61471df2a4d018c2e32f57af20e61c0e7c10273b5232b063c1192bbe2583a887'}]}, 'timestamp': '2025-11-26 09:54:03.671922', '_unique_id': '09a4cdf59152465db236d45eb4a9ca94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.672 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.674 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8dfbf720-61a6-4863-8eb0-238a026ea4a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.674078', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd72dd362-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': 'fd1a67121cbda14131ad7f4fcd6a6cf97fffe113a0e6a9c629e95dc7c9ff552c'}]}, 'timestamp': '2025-11-26 09:54:03.674531', '_unique_id': '6adcf61c63ba493e82d076f2f5e31107'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.675 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.676 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79ab29cd-3dba-4aeb-a875-e70fbf9e3b1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.676628', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd72e36f4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': 'a0441968a0aee49088c7dbdf128c9ae9213ee730a23b7cdd3a9ae120aa71923b'}]}, 'timestamp': '2025-11-26 09:54:03.677117', '_unique_id': 'e805ec753333497db2aefb06c46d1f30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.678 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.679 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.679 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c9b9dbe-6310-426e-a7d7-4d2163bd5853', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.679186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd72e9ab8-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.826294321, 'message_signature': 'f1b8b2ab9acfe5846d5b465c131736dd5b973d126b3ef2eade15abcfc62d6eb7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:54:03.679186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd72eaac6-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.826294321, 'message_signature': 'cf22aee3e4b223ea1b06a795e79d99ec94b2f2a560b7a39d4deb7e09d514a2e9'}]}, 'timestamp': '2025-11-26 09:54:03.680062', '_unique_id': 'a7469fb2d9f44e13b0944c60fd0010a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.682 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.682 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da07eeab-eb5e-4154-bd31-a942f6e939e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:54:03.682356', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd72f168c-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.899985408, 'message_signature': '722c2692fbc48c5587cea354986bf02a47185d2bb92d889224391978a35e8861'}]}, 'timestamp': '2025-11-26 09:54:03.682787', '_unique_id': '2b33f919388e4f548937c39a6e532bc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging
self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.684 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.read.requests in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.684 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.685 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e5b075e-cffa-4cb4-bb06-8444c40dae05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.684862', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd72f79ce-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': 'e1091c274c4a87fdf620608efb74b1044a5e0e6747974635917a5261939431cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:54:03.684862', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd72f89b4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': '96bd299dc5327d3b045a08ea9ba09e3f09fb0212a8a17cc3ff6f9aec3ea0a123'}]}, 'timestamp': '2025-11-26 09:54:03.685721', '_unique_id': 'ca90f85dec2c45f19a92992395d73bf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.687 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e70099f-684f-43a1-8ee3-7be4bfef65f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.687831', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd72feeea-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': '0d61c7ff210b7884c86c50dd80b72ceb58438f6eff82b388fda0b673e6a1bddc'}]}, 'timestamp': '2025-11-26 09:54:03.688345', '_unique_id': '95f0abea12a44a3e9c9bb54dce515e4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.689 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '854594d1-b2c3-4215-b776-90ae29fc1a55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.690403', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd73050f6-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': 'c3b3cea329e0aa96d7674d7bb8b4cb9ee4b52101161b694e4d1a03fa7ceabf0c'}]}, 'timestamp': '2025-11-26 09:54:03.690854', '_unique_id': '3542d38143c84b4199e63a027ef0c4a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.693 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87dbf956-8947-4d5a-a1cf-f35ccaa0f429', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.693187', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd730bdb6-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': '06925ffa3f5d65d6fe620280d9c5958aef72b2979b24068bfe4ddbd3f3a18f6b'}]}, 'timestamp': '2025-11-26 09:54:03.693640', '_unique_id': 'b8df32a86480441cbaec4d07e8ce0031'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.695 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.695 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.696 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c1753e28-50d5-429c-a11e-a70b6acd4bf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.695834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd73126de-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': 'd4bc9553f92fc9b391778acfe88814e79e714d252ceecf5bad2427b23de84420'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:54:03.695834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd73136c4-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': '6026c36ee5fd4e6f3d2fda26140807fb8d465c9c2fc292122d1b806d1c7a07eb'}]}, 'timestamp': '2025-11-26 09:54:03.696707', '_unique_id': '19405b7eb8e84d3aa6f8efd9d981c9d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.699 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.699 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '97c8e157-8244-4953-b974-4dbb5c5b8d25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.699227', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd731aa50-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': '3e63178f006768fc08e70ecebc34cdac2265c17e10d5b98fffb21d4b9c544d9f'}]}, 'timestamp': '2025-11-26 09:54:03.699708', '_unique_id': '4379750184104d1bb00bd0aeedb83053'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.701 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38ddd0f4-466c-43ac-b2f5-5d83bcf9d0a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.701259', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd731f5be-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': 'e8956785f3b668b1700305b75d41266d0226d12b2203973aafac1bf5cd3ce709'}]}, 'timestamp': '2025-11-26 09:54:03.701539', '_unique_id': '206cab6f235c4b40883acfe93a4c3f9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.702 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '326620bd-c780-42df-8840-b38888701561', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:54:03.703005', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7323a10-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': 'ed95239586a618e18d70af369c8844517c0d8d28f9704fbba86a42ffaefaaa9d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:54:03.703005', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd73243ac-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.843903974, 'message_signature': 'f39253799977fa3d1a57101cddfc21715b90b7cdee6904fa58f5415cc5cc0733'}]}, 'timestamp': '2025-11-26 09:54:03.703516', '_unique_id': '78560882bfc84637bc78bb9a497008f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.704 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '15df15c0-d34b-42b3-8f3b-3556c0362170', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.704815', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd7328182-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': '0ed51af3f01795a4460967f4e343afff27523a7490bf521b7975fe20ea55cf01'}]}, 'timestamp': '2025-11-26 09:54:03.705116', '_unique_id': '97fa377ca32f44f395aeaff9c534b910'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.706 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.706 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32d77612-1f7e-48b1-82cb-3847e631a614', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:54:03.706373', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'd732bd78-caad-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11318.905816568, 'message_signature': '637ecc1ebd5f91a499b23ba64bce03b88eafee4bd169463f4b8421f9ba3d2715'}]}, 'timestamp': '2025-11-26 09:54:03.706650', '_unique_id': 'a27cff5839984d51a989a445b64c853c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:54:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 04:54:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:54:03.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:54:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:04 localhost nova_compute[281415]: 2025-11-26 09:54:04.962 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.098 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.099 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.099 281419 DEBUG nova.compute.manager [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:54:05 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:05 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:05 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.309 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.310 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.310 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.311 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 
2025-11-26 09:54:05.793 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.814 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.815 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:54:05 localhost nova_compute[281415]: 2025-11-26 09:54:05.815 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:54:06 localhost podman[301769]: 2025-11-26 09:54:06.092038971 +0000 UTC m=+0.103466633 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 04:54:06 localhost systemd[1]: tmp-crun.0gFsck.mount: Deactivated successfully. Nov 26 04:54:06 localhost podman[301773]: 2025-11-26 09:54:06.143826934 +0000 UTC m=+0.156186936 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 04:54:06 localhost podman[301769]: 2025-11-26 09:54:06.177817449 +0000 UTC m=+0.189245121 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS) Nov 26 04:54:06 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:54:06 localhost podman[301773]: 2025-11-26 09:54:06.229414747 +0000 UTC m=+0.241774759 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:54:06 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:54:06 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:54:06 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:54:06 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:54:06 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:54:06 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:54:07 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:54:07 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:07 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:07 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:07 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:07 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:07 localhost ceph-mon[297296]: 
from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:07 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:07 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536117.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:54:08 localhost nova_compute[281415]: 2025-11-26 09:54:08.092 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:08 localhost ceph-mon[297296]: Reconfiguring crash.np0005536117 (monmap changed)... Nov 26 04:54:08 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536117 on np0005536117.localdomain Nov 26 04:54:08 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:08 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:08 localhost ceph-mon[297296]: Reconfiguring osd.2 (monmap changed)... 
Nov 26 04:54:08 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 26 04:54:08 localhost ceph-mon[297296]: Reconfiguring daemon osd.2 on np0005536117.localdomain Nov 26 04:54:08 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:09 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:09 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:09 localhost ceph-mon[297296]: Reconfiguring osd.5 (monmap changed)... Nov 26 04:54:09 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 26 04:54:09 localhost ceph-mon[297296]: Reconfiguring daemon osd.5 on np0005536117.localdomain Nov 26 04:54:09 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:09 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:09 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536117.tfthzg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 26 04:54:10 localhost nova_compute[281415]: 2025-11-26 09:54:10.006 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:10 localhost ceph-mon[297296]: Reconfiguring 
mds.mds.np0005536117.tfthzg (monmap changed)... Nov 26 04:54:10 localhost ceph-mon[297296]: Reconfiguring daemon mds.mds.np0005536117.tfthzg on np0005536117.localdomain Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536117.ggibwg", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:10 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:11 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536117.ggibwg (monmap changed)... 
Nov 26 04:54:11 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536117.ggibwg on np0005536117.localdomain Nov 26 04:54:11 localhost ceph-mon[297296]: Reconfig service osd.default_drive_group Nov 26 04:54:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:11 localhost ceph-mon[297296]: Reconfiguring crash.np0005536118 (monmap changed)... 
Nov 26 04:54:11 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536118.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:54:11 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536118 on np0005536118.localdomain Nov 26 04:54:12 localhost podman[302016]: Nov 26 04:54:12 localhost podman[302016]: 2025-11-26 09:54:12.16710423 +0000 UTC m=+0.083768218 container create c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_kowalevski, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, release=553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 26 04:54:12 localhost systemd[1]: Started libpod-conmon-c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a.scope. Nov 26 04:54:12 localhost systemd[1]: Started libcrun container. 
Nov 26 04:54:12 localhost podman[302016]: 2025-11-26 09:54:12.134318382 +0000 UTC m=+0.050982410 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:54:12 localhost podman[302016]: 2025-11-26 09:54:12.248603816 +0000 UTC m=+0.165267794 container init c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_kowalevski, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 26 04:54:12 localhost podman[302016]: 2025-11-26 09:54:12.26038921 +0000 UTC m=+0.177053188 container start c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_kowalevski, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 26 04:54:12 localhost podman[302016]: 2025-11-26 09:54:12.260693219 +0000 UTC m=+0.177357207 container attach c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_kowalevski, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured 
and supported base image.) Nov 26 04:54:12 localhost elated_kowalevski[302031]: 167 167 Nov 26 04:54:12 localhost systemd[1]: libpod-c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a.scope: Deactivated successfully. Nov 26 04:54:12 localhost podman[302016]: 2025-11-26 09:54:12.265785685 +0000 UTC m=+0.182449683 container died c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_kowalevski, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux ) Nov 26 04:54:12 localhost podman[302036]: 2025-11-26 09:54:12.384226818 +0000 UTC m=+0.103363710 container remove c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_kowalevski, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, 
description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=) Nov 26 04:54:12 localhost systemd[1]: libpod-conmon-c802c1226f5eac05e6262b6d5344f534248f9885b57af4b5a5fe6d04b6f3a16a.scope: Deactivated successfully. Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 e86: 6 total, 6 up, 6 in Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr handle_mgr_map Activating! 
Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr handle_mgr_map I am now activating Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005536117"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mon metadata", "id": "np0005536117"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005536118"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mon metadata", "id": "np0005536118"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005536119"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mon metadata", "id": "np0005536119"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005536119.dxhchp"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mds metadata", "who": "mds.np0005536119.dxhchp"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon).mds e16 all = 0 Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005536118.kohnma"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mds 
metadata", "who": "mds.np0005536118.kohnma"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon).mds e16 all = 0 Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005536117.tfthzg"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mds metadata", "who": "mds.np0005536117.tfthzg"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon).mds e16 all = 0 Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005536118.anceyj", "id": "np0005536118.anceyj"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mgr metadata", "who": "np0005536118.anceyj", "id": "np0005536118.anceyj"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005536114.ddbqmi", "id": "np0005536114.ddbqmi"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mgr metadata", "who": "np0005536114.ddbqmi", "id": "np0005536114.ddbqmi"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005536119.eupicg", "id": "np0005536119.eupicg"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mgr metadata", "who": "np0005536119.eupicg", "id": "np0005536119.eupicg"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd metadata", "id": 0} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd metadata", "id": 1} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd metadata", "id": 2} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd metadata", "id": 3} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd metadata", "id": 4} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd metadata", "id": 5} : 
dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mds metadata"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mds metadata"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon).mds e16 all = 1 Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd metadata"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd metadata"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon metadata"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mon metadata"} : dispatch Nov 26 04:54:12 localhost systemd-logind[761]: Session 69 logged out. Waiting for processes to exit. Nov 26 04:54:12 localhost ceph-mgr[287388]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: balancer Nov 26 04:54:12 localhost ceph-mgr[287388]: [balancer INFO root] Starting Nov 26 04:54:12 localhost ceph-mgr[287388]: [balancer INFO root] Optimize plan auto_2025-11-26_09:54:12 Nov 26 04:54:12 localhost ceph-mgr[287388]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Nov 26 04:54:12 localhost ceph-mgr[287388]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later Nov 26 04:54:12 localhost ceph-mgr[287388]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost systemd[1]: session-69.scope: Deactivated successfully. 
Nov 26 04:54:12 localhost systemd[1]: session-69.scope: Consumed 21.809s CPU time. Nov 26 04:54:12 localhost systemd-logind[761]: Removed session 69. Nov 26 04:54:12 localhost ceph-mgr[287388]: [cephadm WARNING root] removing stray HostCache host record np0005536114.localdomain.devices.0 Nov 26 04:54:12 localhost ceph-mgr[287388]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005536114.localdomain.devices.0 Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"} : dispatch Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"} : dispatch Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: cephadm Nov 26 04:54:12 localhost ceph-mgr[287388]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: crash Nov 26 04:54:12 localhost ceph-mgr[287388]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: devicehealth Nov 26 04:54:12 localhost ceph-mgr[287388]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr 
load Constructed class from module: iostat Nov 26 04:54:12 localhost ceph-mgr[287388]: [devicehealth INFO root] Starting Nov 26 04:54:12 localhost ceph-mgr[287388]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: nfs Nov 26 04:54:12 localhost ceph-mgr[287388]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: orchestrator Nov 26 04:54:12 localhost ceph-mgr[287388]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: pg_autoscaler Nov 26 04:54:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] _maybe_adjust Nov 26 04:54:12 localhost ceph-mgr[287388]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: progress Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: [progress INFO root] Loading... Nov 26 04:54:12 localhost ceph-mgr[287388]: [progress INFO root] Loaded [, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ] historic events Nov 26 04:54:12 localhost ceph-mgr[287388]: [progress INFO root] Loaded OSDMap, ready. 
Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] recovery thread starting Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] starting setup Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: rbd_support Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536118.anceyj/mirror_snapshot_schedule"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536118.anceyj/mirror_snapshot_schedule"} : dispatch Nov 26 04:54:12 localhost ceph-mgr[287388]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: restful Nov 26 04:54:12 localhost ceph-mgr[287388]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: status Nov 26 04:54:12 localhost ceph-mgr[287388]: [restful INFO root] server_addr: :: server_port: 8003 Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: vms, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: telemetry Nov 26 04:54:12 localhost ceph-mgr[287388]: [restful WARNING root] server not running: no certificate configured Nov 26 04:54:12 localhost ceph-mgr[287388]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5) Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: 
volumes, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: images, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: backups, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] PerfHandler: starting Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_task_task: vms, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_task_task: volumes, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs' Nov 26 04:54:12 localhost ceph-mgr[287388]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs' Nov 26 04:54:12 localhost ceph-mgr[287388]: mgr load Constructed class from module: volumes Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_task_task: images, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_task_task: backups, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] TaskHandler: starting Nov 26 04:54:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536118.anceyj/trash_purge_schedule"} v 0) Nov 26 04:54:12 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536118.anceyj/trash_purge_schedule"} : dispatch Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 
error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.841+0000 7f53d062c640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.841+0000 7f53d062c640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.841+0000 7f53d062c640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.841+0000 7f53d062c640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.841+0000 7f53d062c640 -1 client.0 error registering admin socket command: (17) File exists 
Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.842+0000 7f53d4634640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.843+0000 7f53d4634640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.843+0000 7f53d4634640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.843+0000 7f53d4634640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:54:12.843+0000 7f53d4634640 -1 client.0 error registering admin socket command: (17) File exists Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: vms, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: volumes, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: images, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: backups, start_after= Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting Nov 26 04:54:12 localhost ceph-mgr[287388]: [rbd_support INFO root] setup complete Nov 26 04:54:13 localhost sshd[302210]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:54:13 localhost nova_compute[281415]: 2025-11-26 09:54:13.136 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:13 localhost 
systemd-logind[761]: New session 71 of user ceph-admin. Nov 26 04:54:13 localhost systemd[1]: Started Session 71 of User ceph-admin. Nov 26 04:54:13 localhost systemd[1]: var-lib-containers-storage-overlay-7c755c76dd9fbe7a1f783539eb7231d6c791b46a800c102a9c2bc461a0e83748-merged.mount: Deactivated successfully. Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26711 172.18.0.106:0/3384569901' entity='mgr.np0005536117.ggibwg' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: from='client.? 172.18.0.200:0/1095241444' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: Activating manager daemon np0005536118.anceyj Nov 26 04:54:13 localhost ceph-mon[297296]: from='client.? 
172.18.0.200:0/1095241444' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 26 04:54:13 localhost ceph-mon[297296]: Manager daemon np0005536118.anceyj is now available Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"}]': finished Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005536114.localdomain.devices.0"}]': finished Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536118.anceyj/mirror_snapshot_schedule"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536118.anceyj/mirror_snapshot_schedule"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' 
cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536118.anceyj/trash_purge_schedule"} : dispatch Nov 26 04:54:13 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536118.anceyj/trash_purge_schedule"} : dispatch Nov 26 04:54:13 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:14 localhost ceph-mgr[287388]: [cephadm INFO cherrypy.error] [26/Nov/2025:09:54:14] ENGINE Bus STARTING Nov 26 04:54:14 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : [26/Nov/2025:09:54:14] ENGINE Bus STARTING Nov 26 04:54:14 localhost ceph-mgr[287388]: [cephadm INFO cherrypy.error] [26/Nov/2025:09:54:14] ENGINE Serving on http://172.18.0.107:8765 Nov 26 04:54:14 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : [26/Nov/2025:09:54:14] ENGINE Serving on http://172.18.0.107:8765 Nov 26 04:54:14 localhost podman[302332]: 2025-11-26 09:54:14.346900166 +0000 UTC m=+0.099267043 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph 
Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.component=rhceph-container, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , name=rhceph, version=7, CEPH_POINT_RELEASE=, RELEASE=main) Nov 26 04:54:14 localhost ceph-mgr[287388]: [cephadm INFO cherrypy.error] [26/Nov/2025:09:54:14] ENGINE Serving on https://172.18.0.107:7150 Nov 26 04:54:14 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : [26/Nov/2025:09:54:14] ENGINE Serving on https://172.18.0.107:7150 Nov 26 04:54:14 localhost ceph-mgr[287388]: [cephadm INFO cherrypy.error] [26/Nov/2025:09:54:14] ENGINE Bus STARTED Nov 26 04:54:14 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : [26/Nov/2025:09:54:14] ENGINE Bus STARTED Nov 26 04:54:14 localhost ceph-mgr[287388]: [cephadm INFO cherrypy.error] [26/Nov/2025:09:54:14] ENGINE Client ('172.18.0.107', 34252) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 26 04:54:14 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : [26/Nov/2025:09:54:14] ENGINE Client ('172.18.0.107', 34252) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 26 04:54:14 localhost podman[302332]: 2025-11-26 09:54:14.447591334 +0000 UTC m=+0.199958241 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, GIT_CLEAN=True, release=553, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 
on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux ) Nov 26 04:54:14 localhost ceph-mon[297296]: removing stray HostCache host record np0005536114.localdomain.devices.0 Nov 26 04:54:14 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain.devices.0}] v 0) Nov 26 04:54:14 localhost ceph-mgr[287388]: [devicehealth INFO root] Check health Nov 26 04:54:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain}] v 0) Nov 26 04:54:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0) Nov 26 04:54:15 localhost nova_compute[281415]: 2025-11-26 09:54:15.056 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:54:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0)
Nov 26 04:54:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:15 localhost ceph-mon[297296]: [26/Nov/2025:09:54:14] ENGINE Bus STARTING
Nov 26 04:54:15 localhost ceph-mon[297296]: [26/Nov/2025:09:54:14] ENGINE Serving on http://172.18.0.107:8765
Nov 26 04:54:15 localhost ceph-mon[297296]: [26/Nov/2025:09:54:14] ENGINE Serving on https://172.18.0.107:7150
Nov 26 04:54:15 localhost ceph-mon[297296]: [26/Nov/2025:09:54:14] ENGINE Bus STARTED
Nov 26 04:54:15 localhost ceph-mon[297296]: [26/Nov/2025:09:54:14] ENGINE Client ('172.18.0.107', 34252) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 26 04:54:15 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:15 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:15 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:15 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:15 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:15 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:15 localhost openstack_network_exporter[242153]: ERROR 09:54:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:54:15 localhost openstack_network_exporter[242153]: ERROR 09:54:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 26 04:54:15 localhost openstack_network_exporter[242153]: ERROR 09:54:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 26 04:54:15 localhost openstack_network_exporter[242153]: ERROR 09:54:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 26 04:54:15 localhost openstack_network_exporter[242153]:
Nov 26 04:54:15 localhost openstack_network_exporter[242153]: ERROR 09:54:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 26 04:54:15 localhost openstack_network_exporter[242153]:
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain.devices.0}] v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain}] v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm INFO root] Adjusting osd_memory_target on np0005536117.localdomain to 836.6M
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005536117.localdomain to 836.6M
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005536117.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005536117.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm INFO root] Adjusting osd_memory_target on np0005536119.localdomain to 836.6M
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005536119.localdomain to 836.6M
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005536119.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005536119.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:16 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536117.localdomain to 836.6M
Nov 26 04:54:16 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536117.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536119.localdomain to 836.6M
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:16 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536119.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm INFO root] Adjusting osd_memory_target on np0005536118.localdomain to 836.6M
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005536118.localdomain to 836.6M
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005536118.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005536118.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 26 04:54:16 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536117.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536117.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536118.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536118.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:16 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536119.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:16 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536119.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:17 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mgr[287388]: mgr.server handle_open ignoring open from mgr.np0005536117.ggibwg 172.18.0.106:0/3324018721; not ready for session (expect reconnect)
Nov 26 04:54:17 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:17 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:17 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:17 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536118.localdomain to 836.6M
Nov 26 04:54:17 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:17 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536118.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 26 04:54:17 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:17 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:54:17 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 26 04:54:17 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:17 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:17 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf
Nov 26 04:54:17 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf
Nov 26 04:54:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005536117.ggibwg", "id": "np0005536117.ggibwg"} v 0)
Nov 26 04:54:17 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mgr metadata", "who": "np0005536117.ggibwg", "id": "np0005536117.ggibwg"} : dispatch
Nov 26 04:54:18 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536117.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536117.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost nova_compute[281415]: 2025-11-26 09:54:18.171 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:54:18 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536119.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536119.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536118.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536118.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail
Nov 26 04:54:18 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:18 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain.devices.0}] v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain}] v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 26 04:54:19 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 0 B/s wr, 19 op/s
Nov 26 04:54:19 localhost ceph-mgr[287388]: [progress INFO root] update: starting ev f73a53a2-a175-435b-881f-0b57afc3e981 (Updating node-proxy deployment (+3 -> 3))
Nov 26 04:54:19 localhost ceph-mgr[287388]: [progress INFO root] complete: finished ev f73a53a2-a175-435b-881f-0b57afc3e981 (Updating node-proxy deployment (+3 -> 3))
Nov 26 04:54:19 localhost ceph-mgr[287388]: [progress INFO root] Completed event f73a53a2-a175-435b-881f-0b57afc3e981 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 26 04:54:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:19 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:19 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005536117.localdomain
Nov 26 04:54:19 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005536117.localdomain
Nov 26 04:54:20 localhost nova_compute[281415]: 2025-11-26 09:54:20.085 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:54:20 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:20 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:20 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring
Nov 26 04:54:20 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:20 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:20 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:20 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:20 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:20 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:20 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:20 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 26 04:54:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 04:54:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 04:54:20 localhost podman[303250]: 2025-11-26 09:54:20.848546357 +0000 UTC m=+0.102524134 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:54:20 localhost podman[303251]: 2025-11-26 09:54:20.899553296 +0000 UTC m=+0.153261205 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 26 04:54:20 localhost podman[303251]: 2025-11-26 09:54:20.909261935 +0000 UTC m=+0.162969824 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 26 04:54:20 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully.
Nov 26 04:54:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain.devices.0}] v 0)
Nov 26 04:54:20 localhost podman[303250]: 2025-11-26 09:54:20.961459781 +0000 UTC m=+0.215437528 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 26 04:54:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain}] v 0)
Nov 26 04:54:20 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully.
Nov 26 04:54:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain.devices.0}] v 0)
Nov 26 04:54:21 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain}] v 0)
Nov 26 04:54:21 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 26 04:54:21 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 26 04:54:21 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:21 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:21 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005536117.localdomain
Nov 26 04:54:21 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005536117.localdomain
Nov 26 04:54:21 localhost ceph-mon[297296]: Reconfiguring daemon osd.2 on np0005536117.localdomain
Nov 26 04:54:21 localhost ceph-mon[297296]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Nov 26 04:54:21 localhost ceph-mon[297296]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Nov 26 04:54:21 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:21 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:21 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:21 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 26 04:54:21 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:21 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Nov 26 04:54:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain.devices.0}] v 0)
Nov 26 04:54:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain}] v 0)
Nov 26 04:54:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain.devices.0}] v 0)
Nov 26 04:54:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain}] v 0)
Nov 26 04:54:22 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 26 04:54:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Nov 26 04:54:22 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 26 04:54:22 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 26 04:54:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:22 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:22 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005536118.localdomain
Nov 26 04:54:22 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005536118.localdomain
Nov 26 04:54:22 localhost ceph-mon[297296]: Reconfiguring daemon osd.5 on np0005536117.localdomain
Nov 26 04:54:22 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:22 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:22 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:22 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 26 04:54:22 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:22 localhost podman[303345]:
Nov 26 04:54:22 localhost podman[303345]: 2025-11-26 09:54:22.767117279 +0000 UTC m=+0.083944613 container create ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., release=553, io.openshift.tags=rhceph ceph, version=7)
Nov 26 04:54:22 localhost ceph-mgr[287388]: [progress INFO root] Writing back 50 completed events
Nov 26 04:54:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 26 04:54:22 localhost systemd[1]: Started libpod-conmon-ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8.scope.
Nov 26 04:54:22 localhost podman[303345]: 2025-11-26 09:54:22.735726394 +0000 UTC m=+0.052553758 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:54:22 localhost systemd[1]: Started libcrun container.
Nov 26 04:54:22 localhost podman[303345]: 2025-11-26 09:54:22.860540782 +0000 UTC m=+0.177368126 container init ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, release=553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., version=7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=)
Nov 26 04:54:22 localhost podman[303345]: 2025-11-26 09:54:22.871050056 +0000 UTC m=+0.187877390 container start ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, maintainer=Guillaume Abrioux , RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True)
Nov 26 04:54:22 localhost systemd[1]: tmp-crun.jywCpG.mount: Deactivated successfully.
Nov 26 04:54:22 localhost podman[303345]: 2025-11-26 09:54:22.871861801 +0000 UTC m=+0.188689145 container attach ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, name=rhceph, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 04:54:22 localhost bold_agnesi[303360]: 167 167
Nov 26 04:54:22 localhost systemd[1]: libpod-ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8.scope: Deactivated successfully.
Nov 26 04:54:22 localhost podman[303345]: 2025-11-26 09:54:22.876856424 +0000 UTC m=+0.193683818 container died ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, release=553, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=)
Nov 26 04:54:22 localhost podman[303365]: 2025-11-26 09:54:22.982438182 +0000 UTC m=+0.091874737 container remove ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, version=7, ceph=True, io.buildah.version=1.33.12, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, release=553, maintainer=Guillaume Abrioux , GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7)
Nov 26 04:54:22 localhost systemd[1]: libpod-conmon-ce1a08686e327e80d5d0eae586da4b752d7e6956e3d0f85581c1412287e053b8.scope: Deactivated successfully.
Nov 26 04:54:23 localhost nova_compute[281415]: 2025-11-26 09:54:23.210 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:54:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:23 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Nov 26 04:54:23 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Nov 26 04:54:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Nov 26 04:54:23 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 26 04:54:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:23 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:23 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005536118.localdomain
Nov 26 04:54:23 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005536118.localdomain
Nov 26 04:54:23 localhost ceph-mon[297296]: Reconfiguring osd.0 (monmap changed)...
Nov 26 04:54:23 localhost ceph-mon[297296]: Reconfiguring daemon osd.0 on np0005536118.localdomain
Nov 26 04:54:23 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:23 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:23 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:23 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:23 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 26 04:54:23 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:23 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Nov 26 04:54:23 localhost systemd[1]: var-lib-containers-storage-overlay-553855ff6a8e4d91b558f2462b740bb206a3138a23148f83e34328f1127aeafd-merged.mount: Deactivated successfully.
Nov 26 04:54:23 localhost podman[303443]:
Nov 26 04:54:23 localhost podman[303443]: 2025-11-26 09:54:23.978605663 +0000 UTC m=+0.081272071 container create d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_gauss, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, release=553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git)
Nov 26 04:54:24 localhost systemd[1]: Started libpod-conmon-d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9.scope.
Nov 26 04:54:24 localhost systemd[1]: Started libcrun container.
Nov 26 04:54:24 localhost podman[303443]: 2025-11-26 09:54:24.04451208 +0000 UTC m=+0.147178538 container init d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_gauss, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, version=7, ceph=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 26 04:54:24 localhost podman[303443]: 2025-11-26 09:54:23.947442344 +0000 UTC m=+0.050108762 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:54:24 localhost podman[303443]: 2025-11-26 09:54:24.054844538 +0000 UTC m=+0.157510956 container start d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_gauss, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, architecture=x86_64, io.buildah.version=1.33.12, name=rhceph)
Nov 26 04:54:24 localhost podman[303443]: 2025-11-26 09:54:24.05525076 +0000 UTC m=+0.157917168 container attach d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_gauss, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, ceph=True, GIT_BRANCH=main, name=rhceph, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7)
Nov 26 04:54:24 localhost agitated_gauss[303458]: 167 167
Nov 26 04:54:24 localhost systemd[1]: libpod-d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9.scope: Deactivated successfully.
Nov 26 04:54:24 localhost podman[303443]: 2025-11-26 09:54:24.059316345 +0000 UTC m=+0.161982763 container died d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_gauss, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., release=553, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main)
Nov 26 04:54:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:54:24 localhost podman[303463]: 2025-11-26 09:54:24.163743698 +0000 UTC m=+0.089228616 container remove d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_gauss, maintainer=Guillaume Abrioux , vcs-type=git, version=7, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 26 04:54:24 localhost systemd[1]: libpod-conmon-d55c4d03b46926cfb5f3704977bbfa27e99a44ea9784b5f01e6ea904553e8cf9.scope: Deactivated successfully.
Nov 26 04:54:24 localhost ceph-mgr[287388]: log_channel(audit) log [DBG] : from='client.74111 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 26 04:54:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:24 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)...
Nov 26 04:54:24 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)...
Nov 26 04:54:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 26 04:54:24 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:54:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:24 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:24 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain
Nov 26 04:54:24 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain
Nov 26 04:54:24 localhost ceph-mon[297296]: Reconfiguring osd.4 (monmap changed)...
Nov 26 04:54:24 localhost ceph-mon[297296]: Reconfiguring daemon osd.4 on np0005536118.localdomain
Nov 26 04:54:24 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:24 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:24 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:24 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:54:24 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:24 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005536118.kohnma", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 26 04:54:24 localhost systemd[1]: var-lib-containers-storage-overlay-803f40edbc994931bf99aeca031c43c91bc56fc2898bc626e624d88cbfaadf20-merged.mount: Deactivated successfully.
Nov 26 04:54:25 localhost podman[303539]:
Nov 26 04:54:25 localhost podman[303539]: 2025-11-26 09:54:25.075881033 +0000 UTC m=+0.081474087 container create dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_hofstadter, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, ceph=True, name=rhceph, distribution-scope=public, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=)
Nov 26 04:54:25 localhost nova_compute[281415]: 2025-11-26 09:54:25.114 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:54:25 localhost systemd[1]: Started libpod-conmon-dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078.scope.
Nov 26 04:54:25 localhost podman[303539]: 2025-11-26 09:54:25.042263389 +0000 UTC m=+0.047856493 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 26 04:54:25 localhost systemd[1]: Started libcrun container.
Nov 26 04:54:25 localhost podman[303539]: 2025-11-26 09:54:25.183807653 +0000 UTC m=+0.189400717 container init dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_hofstadter, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, version=7, maintainer=Guillaume Abrioux , vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 26 04:54:25 localhost podman[303539]: 2025-11-26 09:54:25.193788759 +0000 UTC m=+0.199381823 container start dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_hofstadter, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, ceph=True, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 26 04:54:25 localhost podman[303539]: 2025-11-26 09:54:25.194105999 +0000 UTC m=+0.199699083 container attach dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_hofstadter, distribution-scope=public, io.buildah.version=1.33.12, release=553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=)
Nov 26 04:54:25 localhost optimistic_hofstadter[303554]: 167 167
Nov 26 04:54:25 localhost systemd[1]: libpod-dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078.scope: Deactivated successfully.
Nov 26 04:54:25 localhost podman[303539]: 2025-11-26 09:54:25.19735895 +0000 UTC m=+0.202952024 container died dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_hofstadter, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, version=7, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Nov 26 04:54:25 localhost podman[303559]: 2025-11-26 09:54:25.303224845 +0000 UTC m=+0.090533625 container remove dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_hofstadter, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, ceph=True, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 26 04:54:25 localhost systemd[1]: libpod-conmon-dde65ff36098bc6c6d8699c46af7b2c265be4fc563426261e767c8e270081078.scope: Deactivated successfully.
Nov 26 04:54:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:25 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005536118.anceyj (monmap changed)...
Nov 26 04:54:25 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005536118.anceyj (monmap changed)...
Nov 26 04:54:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 26 04:54:25 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:54:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0) Nov 26 04:54:25 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "mgr services"} : dispatch Nov 26 04:54:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 26 04:54:25 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 26 04:54:25 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:54:25 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:54:25 localhost ceph-mon[297296]: Reconfiguring mds.mds.np0005536118.kohnma (monmap changed)... 
Nov 26 04:54:25 localhost ceph-mon[297296]: Reconfiguring daemon mds.mds.np0005536118.kohnma on np0005536118.localdomain Nov 26 04:54:25 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:25 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:54:25 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:25 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005536118.anceyj", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 26 04:54:25 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s Nov 26 04:54:25 localhost systemd[1]: var-lib-containers-storage-overlay-f662622d75da9552d20582eb211b4c67c1c9b3533b88d842e0e950a944d43a5b-merged.mount: Deactivated successfully. Nov 26 04:54:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:54:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:54:26 localhost podman[303633]: Nov 26 04:54:26 localhost systemd[1]: tmp-crun.7dTeOo.mount: Deactivated successfully. 
Nov 26 04:54:26 localhost podman[303625]: 2025-11-26 09:54:26.085697004 +0000 UTC m=+0.110967535 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 26 04:54:26 localhost podman[303633]: 2025-11-26 09:54:26.094655169 +0000 UTC m=+0.093003041 container create c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_sammet, ceph=True, name=rhceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-type=git, release=553, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux ) Nov 26 04:54:26 localhost systemd[1]: Started libpod-conmon-c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d.scope. Nov 26 04:54:26 localhost podman[303633]: 2025-11-26 09:54:26.057581209 +0000 UTC m=+0.055929141 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:54:26 localhost systemd[1]: Started libcrun container. 
Nov 26 04:54:26 localhost podman[303633]: 2025-11-26 09:54:26.183352668 +0000 UTC m=+0.181700520 container init c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_sammet, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, ceph=True, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=553, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7) Nov 26 04:54:26 localhost podman[303633]: 2025-11-26 09:54:26.194703447 +0000 UTC m=+0.193051319 container start c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_sammet, version=7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, distribution-scope=public, ceph=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=) Nov 26 04:54:26 localhost podman[303633]: 2025-11-26 09:54:26.195041457 +0000 UTC m=+0.193389309 container attach c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_sammet, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=7) Nov 26 04:54:26 localhost wizardly_sammet[303673]: 167 167 Nov 26 04:54:26 localhost systemd[1]: 
libpod-c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d.scope: Deactivated successfully. Nov 26 04:54:26 localhost podman[303656]: 2025-11-26 09:54:26.197638007 +0000 UTC m=+0.103218896 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9) Nov 26 04:54:26 localhost podman[303625]: 2025-11-26 09:54:26.213309348 +0000 UTC m=+0.238579919 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible) Nov 26 04:54:26 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:54:26 localhost podman[303633]: 2025-11-26 09:54:26.250584785 +0000 UTC m=+0.248932707 container died c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_sammet, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, release=553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public) Nov 26 04:54:26 localhost podman[303656]: 2025-11-26 09:54:26.265448502 +0000 UTC m=+0.171029381 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm) Nov 26 04:54:26 localhost systemd[1]: 
a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:54:26 localhost podman[303688]: 2025-11-26 09:54:26.361463985 +0000 UTC m=+0.147098365 container remove c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_sammet, name=rhceph, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, version=7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, ceph=True, com.redhat.component=rhceph-container) Nov 26 04:54:26 localhost systemd[1]: libpod-conmon-c85f3a0012b83377940422f8fe60e91b9e704579989092d3dcf1891ab6855d3d.scope: Deactivated successfully. 
Nov 26 04:54:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0) Nov 26 04:54:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0) Nov 26 04:54:26 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005536119 (monmap changed)... Nov 26 04:54:26 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005536119 (monmap changed)... Nov 26 04:54:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 26 04:54:26 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:54:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 26 04:54:26 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 26 04:54:26 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain Nov 26 04:54:26 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain Nov 26 04:54:26 localhost ceph-mon[297296]: Reconfiguring mgr.np0005536118.anceyj (monmap changed)... 
Nov 26 04:54:26 localhost ceph-mon[297296]: Reconfiguring daemon mgr.np0005536118.anceyj on np0005536118.localdomain Nov 26 04:54:26 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:26 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:54:26 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:26 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005536119.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 26 04:54:26 localhost systemd[1]: var-lib-containers-storage-overlay-aea68da01d42c4e9eb6f7ef3bdbf405a30a03cedec1aaf27ac62434b125d9dd4-merged.mount: Deactivated successfully. Nov 26 04:54:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0) Nov 26 04:54:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0) Nov 26 04:54:27 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)... Nov 26 04:54:27 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... 
Nov 26 04:54:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Nov 26 04:54:27 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 26 04:54:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 26 04:54:27 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 26 04:54:27 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005536119.localdomain Nov 26 04:54:27 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005536119.localdomain Nov 26 04:54:27 localhost podman[240049]: time="2025-11-26T09:54:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:54:27 localhost podman[240049]: @ - - [26/Nov/2025:09:54:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:54:27 localhost podman[240049]: @ - - [26/Nov/2025:09:54:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18720 "" "Go-http-client/1.1" Nov 26 04:54:27 localhost ceph-mon[297296]: Reconfiguring crash.np0005536119 (monmap changed)... 
Nov 26 04:54:27 localhost ceph-mon[297296]: Reconfiguring daemon crash.np0005536119 on np0005536119.localdomain Nov 26 04:54:27 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:27 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 26 04:54:27 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:27 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s Nov 26 04:54:28 localhost nova_compute[281415]: 2025-11-26 09:54:28.250 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0) Nov 26 04:54:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0) Nov 26 04:54:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0) Nov 26 04:54:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0) Nov 26 04:54:28 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... Nov 26 04:54:28 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... 
Nov 26 04:54:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0) Nov 26 04:54:28 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 26 04:54:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 26 04:54:28 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 26 04:54:28 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005536119.localdomain Nov 26 04:54:28 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005536119.localdomain Nov 26 04:54:28 localhost ceph-mon[297296]: Reconfiguring osd.1 (monmap changed)... 
Nov 26 04:54:28 localhost ceph-mon[297296]: Reconfiguring daemon osd.1 on np0005536119.localdomain Nov 26 04:54:28 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:28 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:28 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:28 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 26 04:54:28 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0) Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0) Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0) Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0) Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 26 04:54:29 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 
handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Nov 26 04:54:29 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 26 04:54:29 localhost ceph-mgr[287388]: [progress INFO root] update: starting ev b9730888-bc4c-4c76-a0cb-297506ef18aa (Updating node-proxy deployment (+3 -> 3)) Nov 26 04:54:29 localhost ceph-mgr[287388]: [progress INFO root] complete: finished ev b9730888-bc4c-4c76-a0cb-297506ef18aa (Updating node-proxy deployment (+3 -> 3)) Nov 26 04:54:29 localhost ceph-mgr[287388]: [progress INFO root] Completed event b9730888-bc4c-4c76-a0cb-297506ef18aa (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Nov 26 04:54:29 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Nov 26 04:54:29 localhost ceph-mon[297296]: Reconfiguring osd.3 (monmap changed)... 
Nov 26 04:54:29 localhost ceph-mon[297296]: Reconfiguring daemon osd.3 on np0005536119.localdomain Nov 26 04:54:29 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:29 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:29 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:29 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:29 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:54:29 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:54:29 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s Nov 26 04:54:29 localhost systemd[1]: tmp-crun.2xd4Sp.mount: Deactivated successfully. 
Nov 26 04:54:29 localhost podman[303725]: 2025-11-26 09:54:29.73857636 +0000 UTC m=+0.074429071 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:54:29 localhost podman[303725]: 2025-11-26 09:54:29.758386939 +0000 UTC m=+0.094239580 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:54:29 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:54:29 localhost ceph-mgr[287388]: log_channel(audit) log [DBG] : from='client.74114 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 26 04:54:29 localhost ceph-mgr[287388]: [cephadm INFO root] Saving service mon spec with placement label:mon
Nov 26 04:54:29 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:29 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 26 04:54:29 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Nov 26 04:54:29 localhost ceph-mgr[287388]: [progress INFO root] update: starting ev d16ed2a9-5847-420c-ad42-6ca0e05da2d4 (Updating node-proxy deployment (+3 -> 3))
Nov 26 04:54:29 localhost ceph-mgr[287388]: [progress INFO root] complete: finished ev d16ed2a9-5847-420c-ad42-6ca0e05da2d4 (Updating node-proxy deployment (+3 -> 3))
Nov 26 04:54:29 localhost ceph-mgr[287388]: [progress INFO root] Completed event d16ed2a9-5847-420c-ad42-6ca0e05da2d4 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 26 04:54:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 26 04:54:29 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 26 04:54:30 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005536117 (monmap changed)...
Nov 26 04:54:30 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005536117 (monmap changed)...
Nov 26 04:54:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 26 04:54:30 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:54:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Nov 26 04:54:30 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 26 04:54:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:30 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:30 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005536117 on np0005536117.localdomain
Nov 26 04:54:30 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005536117 on np0005536117.localdomain
Nov 26 04:54:30 localhost nova_compute[281415]: 2025-11-26 09:54:30.170 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:54:30 localhost ceph-mon[297296]: Saving service mon spec with placement label:mon
Nov 26 04:54:30 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 04:54:30 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:30 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:30 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:30 localhost ceph-mon[297296]: Reconfiguring mon.np0005536117 (monmap changed)...
Nov 26 04:54:30 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:54:30 localhost ceph-mon[297296]: Reconfiguring daemon mon.np0005536117 on np0005536117.localdomain
Nov 26 04:54:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain.devices.0}] v 0)
Nov 26 04:54:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536117.localdomain}] v 0)
Nov 26 04:54:30 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005536118 (monmap changed)...
Nov 26 04:54:30 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005536118 (monmap changed)...
Nov 26 04:54:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Nov 26 04:54:30 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 26 04:54:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Nov 26 04:54:30 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Nov 26 04:54:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 26 04:54:30 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 26 04:54:31 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005536118 on np0005536118.localdomain Nov 26 04:54:31 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005536118 on np0005536118.localdomain Nov 26 04:54:31 localhost podman[303818]: Nov 26 04:54:31 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:31 localhost podman[303818]: 2025-11-26 09:54:31.71882834 +0000 UTC m=+0.087454512 container create 5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_franklin, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, name=rhceph, GIT_CLEAN=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 26 04:54:31 localhost systemd[1]: Started libpod-conmon-5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c.scope. Nov 26 04:54:31 localhost podman[303818]: 2025-11-26 09:54:31.684090981 +0000 UTC m=+0.052717203 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 26 04:54:31 localhost systemd[1]: Started libcrun container. 
Nov 26 04:54:31 localhost podman[303818]: 2025-11-26 09:54:31.798293064 +0000 UTC m=+0.166919236 container init 5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_franklin, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, release=553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64) Nov 26 04:54:31 localhost interesting_franklin[303833]: 167 167 Nov 26 04:54:31 localhost systemd[1]: libpod-5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c.scope: Deactivated successfully. 
Nov 26 04:54:31 localhost podman[303818]: 2025-11-26 09:54:31.813783329 +0000 UTC m=+0.182409511 container start 5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_franklin, vcs-type=git, ceph=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph) Nov 26 04:54:31 localhost podman[303818]: 2025-11-26 09:54:31.815991158 +0000 UTC m=+0.184617340 container attach 5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_franklin, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, 
name=rhceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, version=7, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 26 04:54:31 localhost podman[303818]: 2025-11-26 09:54:31.819807506 +0000 UTC m=+0.188433678 container died 5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_franklin, release=553, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, vendor=Red Hat, Inc.) 
Nov 26 04:54:31 localhost podman[303838]: 2025-11-26 09:54:31.922622737 +0000 UTC m=+0.097643334 container remove 5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_franklin, RELEASE=main, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=) Nov 26 04:54:31 localhost systemd[1]: libpod-conmon-5141ed7fea2a4f0945565b3779de8bbb148ba1b1f1206f2eeb5bff0ca2c7c50c.scope: Deactivated successfully. Nov 26 04:54:31 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:54:31 localhost ceph-mon[297296]: Reconfiguring mon.np0005536118 (monmap changed)... 
Nov 26 04:54:31 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:54:31 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:31 localhost ceph-mon[297296]: Reconfiguring daemon mon.np0005536118 on np0005536118.localdomain
Nov 26 04:54:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain.devices.0}] v 0)
Nov 26 04:54:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536118.localdomain}] v 0)
Nov 26 04:54:32 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005536119 (monmap changed)...
Nov 26 04:54:32 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005536119 (monmap changed)...
Nov 26 04:54:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Nov 26 04:54:32 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:54:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Nov 26 04:54:32 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Nov 26 04:54:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 26 04:54:32 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 26 04:54:32 localhost ceph-mgr[287388]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005536119 on np0005536119.localdomain
Nov 26 04:54:32 localhost ceph-mgr[287388]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005536119 on np0005536119.localdomain
Nov 26 04:54:32 localhost systemd[1]: var-lib-containers-storage-overlay-dac83d614beba08968abdad0eff42bf377b7ba5f0aa89f3b33c8b282721cf11d-merged.mount: Deactivated successfully.
Nov 26 04:54:32 localhost ceph-mgr[287388]: [progress INFO root] Writing back 50 completed events
Nov 26 04:54:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 26 04:54:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain.devices.0}] v 0)
Nov 26 04:54:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005536119.localdomain}] v 0)
Nov 26 04:54:33 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:33 localhost ceph-mon[297296]: Reconfiguring mon.np0005536119 (monmap changed)...
Nov 26 04:54:33 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 26 04:54:33 localhost ceph-mon[297296]: Reconfiguring daemon mon.np0005536119 on np0005536119.localdomain
Nov 26 04:54:33 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:33 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:33 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:33 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj'
Nov 26 04:54:33 localhost nova_compute[281415]: 2025-11-26 09:54:33.287 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:54:33 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail
Nov 26 04:54:33 localhost sshd[303856]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 04:54:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:54:34 localhost ceph-mgr[287388]: log_channel(audit) log [DBG] : from='client.74117 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005536119", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 26 04:54:35 localhost nova_compute[281415]: 2025-11-26 09:54:35.193 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:54:35 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail
Nov 26 04:54:36 localhost systemd[1]: Started /usr/bin/podman
healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:54:36 localhost systemd[1]: tmp-crun.AJffuX.mount: Deactivated successfully. Nov 26 04:54:36 localhost podman[303859]: 2025-11-26 09:54:36.843715392 +0000 UTC m=+0.096527090 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 26 04:54:36 localhost podman[303858]: 2025-11-26 09:54:36.889321765 +0000 UTC m=+0.142221396 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 04:54:36 localhost podman[303859]: 2025-11-26 09:54:36.905366209 +0000 UTC m=+0.158177947 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Nov 26 04:54:36 localhost systemd[1]: 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:54:36 localhost podman[303858]: 2025-11-26 09:54:36.923344071 +0000 UTC m=+0.176243662 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 
04:54:36 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:54:37 localhost nova_compute[281415]: 2025-11-26 09:54:37.069 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:37 localhost nova_compute[281415]: 2025-11-26 09:54:37.100 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Triggering sync for uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 26 04:54:37 localhost nova_compute[281415]: 2025-11-26 09:54:37.101 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:54:37 localhost nova_compute[281415]: 2025-11-26 09:54:37.101 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:54:37 localhost nova_compute[281415]: 2025-11-26 09:54:37.134 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.033s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:54:37 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:38 localhost nova_compute[281415]: 2025-11-26 09:54:38.335 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:39 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:40 localhost nova_compute[281415]: 2025-11-26 09:54:40.226 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:41 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. Nov 26 04:54:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [] Nov 26 04:54:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. Nov 26 04:54:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [] Nov 26 04:54:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. 
Nov 26 04:54:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [] Nov 26 04:54:43 localhost nova_compute[281415]: 2025-11-26 09:54:43.375 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:43 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:45 localhost nova_compute[281415]: 2025-11-26 09:54:45.265 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:45 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:45 localhost openstack_network_exporter[242153]: ERROR 09:54:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:54:45 localhost openstack_network_exporter[242153]: ERROR 09:54:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:54:45 localhost openstack_network_exporter[242153]: ERROR 09:54:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:54:45 localhost openstack_network_exporter[242153]: ERROR 09:54:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:54:45 localhost openstack_network_exporter[242153]: Nov 26 04:54:45 localhost openstack_network_exporter[242153]: ERROR 09:54:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:54:45 localhost 
openstack_network_exporter[242153]: Nov 26 04:54:47 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:48 localhost nova_compute[281415]: 2025-11-26 09:54:48.399 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:49 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:50 localhost nova_compute[281415]: 2025-11-26 09:54:50.300 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:51 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:54:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:54:51 localhost podman[303897]: 2025-11-26 09:54:51.837684572 +0000 UTC m=+0.094785766 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:54:51 localhost podman[303898]: 2025-11-26 09:54:51.898547855 +0000 UTC m=+0.152123700 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:54:51 localhost podman[303897]: 2025-11-26 09:54:51.904786556 +0000 UTC m=+0.161887750 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 
04:54:51 localhost podman[303898]: 2025-11-26 09:54:51.918310512 +0000 UTC m=+0.171886347 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute) Nov 26 04:54:51 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:54:51 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:54:53 localhost nova_compute[281415]: 2025-11-26 09:54:53.566 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:53 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:53 localhost nova_compute[281415]: 2025-11-26 09:54:53.875 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:54 localhost nova_compute[281415]: 2025-11-26 09:54:54.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:54 localhost nova_compute[281415]: 2025-11-26 09:54:54.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:55 localhost nova_compute[281415]: 2025-11-26 09:54:55.337 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:55 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 
active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:55 localhost nova_compute[281415]: 2025-11-26 09:54:55.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:55 localhost nova_compute[281415]: 2025-11-26 09:54:55.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:54:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:54:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:54:56 localhost systemd[298302]: Starting Mark boot as successful... 
Nov 26 04:54:56 localhost podman[303938]: 2025-11-26 09:54:56.833027111 +0000 UTC m=+0.088862924 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 04:54:56 localhost systemd[298302]: Finished Mark boot as successful. 
Nov 26 04:54:56 localhost nova_compute[281415]: 2025-11-26 09:54:56.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:56 localhost nova_compute[281415]: 2025-11-26 09:54:56.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:56 localhost podman[303939]: 2025-11-26 09:54:56.91230296 +0000 UTC m=+0.161807238 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9) Nov 26 04:54:56 localhost podman[303939]: 2025-11-26 09:54:56.931264273 +0000 UTC m=+0.180768581 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, 
io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, release=1755695350, distribution-scope=public) Nov 26 04:54:56 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:54:56 localhost podman[303938]: 2025-11-26 09:54:56.982288683 +0000 UTC m=+0.238124496 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 04:54:56 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:54:57 localhost podman[240049]: time="2025-11-26T09:54:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:54:57 localhost podman[240049]: @ - - [26/Nov/2025:09:54:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:54:57 localhost podman[240049]: @ - - [26/Nov/2025:09:54:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18729 "" "Go-http-client/1.1" Nov 26 04:54:57 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:54:57 localhost nova_compute[281415]: 2025-11-26 09:54:57.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:54:57 localhost nova_compute[281415]: 2025-11-26 09:54:57.874 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:54:57 localhost nova_compute[281415]: 2025-11-26 09:54:57.874 
281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:54:57 localhost nova_compute[281415]: 2025-11-26 09:54:57.875 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:54:57 localhost nova_compute[281415]: 2025-11-26 09:54:57.875 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:54:57 localhost nova_compute[281415]: 2025-11-26 09:54:57.875 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:54:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:54:58 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2616987798' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.339 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.414 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.415 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.569 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.649 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.650 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11751MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.651 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.651 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.746 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.746 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.747 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:54:58 localhost nova_compute[281415]: 2025-11-26 09:54:58.799 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:54:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:54:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:54:59 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/4217271828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:54:59 localhost nova_compute[281415]: 2025-11-26 09:54:59.263 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:54:59 localhost nova_compute[281415]: 2025-11-26 09:54:59.270 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:54:59 localhost nova_compute[281415]: 2025-11-26 09:54:59.292 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:54:59 localhost nova_compute[281415]: 2025-11-26 09:54:59.294 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:54:59 localhost nova_compute[281415]: 2025-11-26 09:54:59.295 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:54:59 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:00 localhost nova_compute[281415]: 2025-11-26 09:55:00.295 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:00 localhost nova_compute[281415]: 2025-11-26 09:55:00.376 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:55:00 localhost systemd[1]: tmp-crun.XwHlNm.mount: Deactivated successfully. 
Nov 26 04:55:00 localhost podman[304025]: 2025-11-26 09:55:00.832164728 +0000 UTC m=+0.090379190 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 04:55:00 localhost podman[304025]: 2025-11-26 09:55:00.840592628 +0000 UTC m=+0.098807110 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:55:00 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:55:01 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:01 localhost nova_compute[281415]: 2025-11-26 09:55:01.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:01 localhost nova_compute[281415]: 2025-11-26 09:55:01.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:55:01 localhost nova_compute[281415]: 2025-11-26 09:55:01.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:55:02 localhost nova_compute[281415]: 2025-11-26 09:55:02.313 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:55:02 localhost nova_compute[281415]: 2025-11-26 09:55:02.314 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:55:02 localhost nova_compute[281415]: 2025-11-26 09:55:02.315 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for 
instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:55:02 localhost nova_compute[281415]: 2025-11-26 09:55:02.315 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:55:03 localhost nova_compute[281415]: 2025-11-26 09:55:03.512 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:55:03 localhost nova_compute[281415]: 2025-11-26 09:55:03.536 281419 DEBUG oslo_concurrency.lockutils [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:55:03 localhost nova_compute[281415]: 2025-11-26 09:55:03.536 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:55:03 localhost nova_compute[281415]: 2025-11-26 09:55:03.571 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:55:03.661 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:55:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:55:03.661 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:55:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:55:03.662 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:55:03 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:04 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:05 localhost nova_compute[281415]: 2025-11-26 09:55:05.380 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:05 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:55:07 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:55:07 localhost systemd[1]: tmp-crun.kLASyj.mount: Deactivated successfully. 
Nov 26 04:55:07 localhost podman[304047]: 2025-11-26 09:55:07.837705494 +0000 UTC m=+0.093652353 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent) Nov 26 04:55:07 localhost podman[304047]: 2025-11-26 09:55:07.868181069 +0000 UTC 
m=+0.124127878 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:55:07 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:55:07 localhost podman[304048]: 2025-11-26 09:55:07.872881871 +0000 UTC m=+0.127121087 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:55:07 localhost podman[304048]: 2025-11-26 09:55:07.952741334 +0000 UTC m=+0.206980580 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 26 04:55:07 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:55:08 localhost nova_compute[281415]: 2025-11-26 09:55:08.575 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:09 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:10 localhost nova_compute[281415]: 2025-11-26 09:55:10.415 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:11 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:12 localhost sshd[304084]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:55:12 localhost ceph-mgr[287388]: [balancer INFO root] Optimize plan auto_2025-11-26_09:55:12 Nov 26 04:55:12 localhost ceph-mgr[287388]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Nov 26 04:55:12 localhost ceph-mgr[287388]: [balancer INFO root] do_upmap Nov 26 04:55:12 localhost ceph-mgr[287388]: [balancer INFO root] pools ['manila_metadata', 'manila_data', '.mgr', 'volumes', 'vms', 'backups', 'images'] Nov 26 04:55:12 localhost ceph-mgr[287388]: [balancer INFO root] prepared 0/10 changes Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] _maybe_adjust Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1) Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO 
root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32) Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32) Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 26 04:55:12 localhost ceph-mgr[287388]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16) Nov 26 04:55:12 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. 
Nov 26 04:55:12 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [] Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: vms, start_after= Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: volumes, start_after= Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: images, start_after= Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: backups, start_after= Nov 26 04:55:12 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. Nov 26 04:55:12 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [] Nov 26 04:55:12 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. Nov 26 04:55:12 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [] Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: vms, start_after= Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: volumes, start_after= Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: images, start_after= Nov 26 04:55:12 localhost ceph-mgr[287388]: [rbd_support INFO root] load_schedules: backups, start_after= Nov 26 04:55:13 localhost nova_compute[281415]: 2025-11-26 09:55:13.578 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:13 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd 
e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:15 localhost nova_compute[281415]: 2025-11-26 09:55:15.458 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:15 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:15 localhost openstack_network_exporter[242153]: ERROR 09:55:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:55:15 localhost openstack_network_exporter[242153]: ERROR 09:55:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:55:15 localhost openstack_network_exporter[242153]: ERROR 09:55:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:55:15 localhost openstack_network_exporter[242153]: ERROR 09:55:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:55:15 localhost openstack_network_exporter[242153]: Nov 26 04:55:15 localhost openstack_network_exporter[242153]: ERROR 09:55:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:55:15 localhost openstack_network_exporter[242153]: Nov 26 04:55:17 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:18 localhost nova_compute[281415]: 2025-11-26 09:55:18.619 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 
kv_alloc: 322961408 Nov 26 04:55:19 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:20 localhost nova_compute[281415]: 2025-11-26 09:55:20.493 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:21 localhost ceph-mgr[287388]: log_channel(audit) log [DBG] : from='client.74147 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Nov 26 04:55:21 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:55:22 localhost podman[304087]: 2025-11-26 09:55:22.835652172 +0000 UTC m=+0.085610278 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:55:22 localhost podman[304087]: 2025-11-26 09:55:22.851464932 +0000 UTC m=+0.101423108 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 04:55:22 localhost podman[304086]: 2025-11-26 09:55:22.891070133 +0000 UTC m=+0.144007030 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:55:22 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:55:22 localhost podman[304086]: 2025-11-26 09:55:22.924748265 +0000 UTC m=+0.177685132 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:55:22 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:55:23 localhost nova_compute[281415]: 2025-11-26 09:55:23.647 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:23 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:25 localhost nova_compute[281415]: 2025-11-26 09:55:25.529 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:25 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:27 localhost podman[240049]: time="2025-11-26T09:55:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:55:27 localhost podman[240049]: @ - - [26/Nov/2025:09:55:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:55:27 localhost podman[240049]: @ - - [26/Nov/2025:09:55:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18725 "" "Go-http-client/1.1" Nov 26 04:55:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:55:27 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:55:27 localhost systemd[1]: tmp-crun.p125Rw.mount: Deactivated successfully. Nov 26 04:55:27 localhost podman[304131]: 2025-11-26 09:55:27.859006298 +0000 UTC m=+0.103169261 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 26 04:55:27 localhost podman[304131]: 2025-11-26 09:55:27.900345882 +0000 UTC m=+0.144508835 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=) Nov 26 04:55:27 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:55:27 localhost systemd[1]: tmp-crun.0riJzw.mount: Deactivated successfully. 
Nov 26 04:55:27 localhost podman[304130]: 2025-11-26 09:55:27.950975438 +0000 UTC m=+0.201165294 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_id=ovn_controller) Nov 26 04:55:28 localhost podman[304130]: 2025-11-26 09:55:28.042486174 +0000 UTC m=+0.292676030 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller) Nov 26 04:55:28 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:55:28 localhost nova_compute[281415]: 2025-11-26 09:55:28.648 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:29 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:30 localhost sshd[304175]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:55:30 localhost nova_compute[281415]: 2025-11-26 09:55:30.568 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:55:31 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:31 localhost podman[304177]: 2025-11-26 09:55:31.81896666 +0000 UTC m=+0.079611196 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', 
'--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:55:31 localhost podman[304177]: 2025-11-26 09:55:31.830351595 +0000 UTC m=+0.090996111 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 
'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:55:31 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 04:55:33 localhost ceph-mgr[287388]: log_channel(audit) log [DBG] : from='client.74156 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Nov 26 04:55:33 localhost nova_compute[281415]: 2025-11-26 09:55:33.684 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:33 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:33 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 26 04:55:33 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 26 04:55:33 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Nov 26 04:55:33 localhost ceph-mon[297296]: log_channel(audit) log [INF] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:55:33 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 26 04:55:33 localhost ceph-mgr[287388]: [progress INFO root] update: starting ev d00b64bb-50b6-455e-bef5-7844f2288a10 (Updating node-proxy deployment (+3 -> 3)) Nov 26 04:55:33 localhost 
ceph-mgr[287388]: [progress INFO root] complete: finished ev d00b64bb-50b6-455e-bef5-7844f2288a10 (Updating node-proxy deployment (+3 -> 3)) Nov 26 04:55:33 localhost ceph-mgr[287388]: [progress INFO root] Completed event d00b64bb-50b6-455e-bef5-7844f2288a10 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Nov 26 04:55:33 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Nov 26 04:55:33 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Nov 26 04:55:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:34 localhost ceph-mon[297296]: from='mgr.26723 172.18.0.107:0/56348432' entity='mgr.np0005536118.anceyj' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:55:34 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:55:35 localhost nova_compute[281415]: 2025-11-26 09:55:35.599 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:35 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:37 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:37 localhost ceph-mgr[287388]: [progress INFO root] Writing back 50 completed events Nov 26 04:55:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 26 04:55:38 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:55:38 localhost podman[304287]: 2025-11-26 09:55:38.340845499 +0000 UTC m=+0.095097806 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:55:38 localhost podman[304287]: 2025-11-26 09:55:38.347027207 +0000 UTC m=+0.101279604 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 04:55:38 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:55:38 localhost podman[304288]: 2025-11-26 09:55:38.433666935 +0000 UTC m=+0.183771276 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Nov 26 04:55:38 localhost podman[304288]: 2025-11-26 09:55:38.444540945 +0000 UTC m=+0.194645296 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0) Nov 26 04:55:38 localhost systemd[1]: 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:55:38 localhost nova_compute[281415]: 2025-11-26 09:55:38.727 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:38 localhost ceph-mon[297296]: from='mgr.26723 ' entity='mgr.np0005536118.anceyj' Nov 26 04:55:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:39 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:40 localhost nova_compute[281415]: 2025-11-26 09:55:40.626 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:41 localhost ceph-mgr[287388]: log_channel(cluster) log [DBG] : pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 597 MiB used, 41 GiB / 42 GiB avail Nov 26 04:55:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. Nov 26 04:55:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [] Nov 26 04:55:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. Nov 26 04:55:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', )] Nov 26 04:55:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs' Nov 26 04:55:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] scanning for idle connections.. 
Nov 26 04:55:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', )] Nov 26 04:55:42 localhost ceph-mgr[287388]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs' Nov 26 04:55:42 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 e87: 6 total, 6 up, 6 in Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr handle_mgr_map I was active but no longer am Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn e: '/usr/bin/ceph-mgr' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 0: '/usr/bin/ceph-mgr' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 1: '-n' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 2: 'mgr.np0005536118.anceyj' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 3: '-f' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 4: '--setuser' Nov 26 04:55:42 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:42.843+0000 7f545c85b640 -1 mgr handle_mgr_map I was active but no longer am Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 5: 'ceph' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 6: '--setgroup' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 7: 'ceph' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 8: '--default-log-to-file=false' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 9: '--default-log-to-journald=true' Nov 26 04:55:42 localhost ceph-mgr[287388]: mgr respawn 10: '--default-log-to-stderr=false' Nov 26 04:55:42 localhost ceph-mon[297296]: from='client.? 172.18.0.200:0/463095928' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:55:42 localhost ceph-mon[297296]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 26 04:55:42 localhost ceph-mon[297296]: Activating manager daemon np0005536119.eupicg Nov 26 04:55:42 localhost ceph-mon[297296]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 26 04:55:42 localhost ceph-mon[297296]: Manager daemon np0005536119.eupicg is now available Nov 26 04:55:42 localhost systemd[1]: session-71.scope: Deactivated successfully. Nov 26 04:55:42 localhost systemd[1]: session-71.scope: Consumed 12.141s CPU time. Nov 26 04:55:42 localhost systemd-logind[761]: Session 71 logged out. Waiting for processes to exit. Nov 26 04:55:42 localhost systemd-logind[761]: Removed session 71. Nov 26 04:55:42 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: ignoring --setuser ceph since I am not root Nov 26 04:55:42 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: ignoring --setgroup ceph since I am not root Nov 26 04:55:42 localhost ceph-mgr[287388]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Nov 26 04:55:42 localhost ceph-mgr[287388]: pidfile_write: ignore empty --pid-file Nov 26 04:55:43 localhost ceph-mgr[287388]: mgr[py] Loading python module 'alerts' Nov 26 04:55:43 localhost ceph-mgr[287388]: mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 26 04:55:43 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:43.087+0000 7f8fa3ac1140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 26 04:55:43 localhost ceph-mgr[287388]: mgr[py] Loading python module 'balancer' Nov 26 04:55:43 localhost ceph-mgr[287388]: mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 26 04:55:43 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:43.152+0000 7f8fa3ac1140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 26 04:55:43 localhost ceph-mgr[287388]: mgr[py] Loading python module 'cephadm' Nov 26 04:55:43 localhost sshd[304346]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:55:43 localhost 
systemd-logind[761]: New session 72 of user ceph-admin. Nov 26 04:55:43 localhost systemd[1]: Started Session 72 of User ceph-admin. Nov 26 04:55:43 localhost ceph-mgr[287388]: mgr[py] Loading python module 'crash' Nov 26 04:55:43 localhost nova_compute[281415]: 2025-11-26 09:55:43.762 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:43 localhost ceph-mgr[287388]: mgr[py] Module crash has missing NOTIFY_TYPES member Nov 26 04:55:43 localhost ceph-mgr[287388]: mgr[py] Loading python module 'dashboard' Nov 26 04:55:43 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:43.788+0000 7f8fa3ac1140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Nov 26 04:55:43 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536119.eupicg/mirror_snapshot_schedule"} : dispatch Nov 26 04:55:43 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536119.eupicg/mirror_snapshot_schedule"} : dispatch Nov 26 04:55:43 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536119.eupicg/trash_purge_schedule"} : dispatch Nov 26 04:55:43 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005536119.eupicg/trash_purge_schedule"} : dispatch Nov 26 04:55:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Loading python module 'devicehealth' Nov 26 04:55:44 
localhost ceph-mgr[287388]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Loading python module 'diskprediction_local' Nov 26 04:55:44 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:44.336+0000 7f8fa3ac1140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 26 04:55:44 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Nov 26 04:55:44 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. 
Nov 26 04:55:44 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: from numpy import show_config as show_numpy_config Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 26 04:55:44 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:44.469+0000 7f8fa3ac1140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Loading python module 'influx' Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Module influx has missing NOTIFY_TYPES member Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Loading python module 'insights' Nov 26 04:55:44 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:44.526+0000 7f8fa3ac1140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Nov 26 04:55:44 localhost systemd[1]: tmp-crun.POi432.mount: Deactivated successfully. 
Nov 26 04:55:44 localhost podman[304463]: 2025-11-26 09:55:44.561237263 +0000 UTC m=+0.130109109 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , version=7, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12) Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Loading python module 'iostat' Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Loading python module 'k8sevents' Nov 26 04:55:44 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:44.638+0000 7f8fa3ac1140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 26 04:55:44 localhost podman[304463]: 2025-11-26 09:55:44.66400662 +0000 UTC m=+0.232878496 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , release=553, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, distribution-scope=public) Nov 26 04:55:44 localhost ceph-mon[297296]: [26/Nov/2025:09:55:44] ENGINE Bus STARTING Nov 26 04:55:44 localhost ceph-mon[297296]: [26/Nov/2025:09:55:44] ENGINE Serving on http://172.18.0.108:8765 Nov 26 04:55:44 localhost ceph-mon[297296]: [26/Nov/2025:09:55:44] ENGINE Serving on https://172.18.0.108:7150 Nov 26 04:55:44 localhost ceph-mon[297296]: [26/Nov/2025:09:55:44] ENGINE Bus STARTED Nov 26 04:55:44 localhost ceph-mon[297296]: [26/Nov/2025:09:55:44] ENGINE Client ('172.18.0.108', 40526) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 26 04:55:44 localhost ceph-mon[297296]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Nov 26 04:55:44 localhost ceph-mon[297296]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 
stray host(s) with 1 daemon(s) not managed by cephadm) Nov 26 04:55:44 localhost ceph-mon[297296]: Cluster is now healthy Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:44.936825) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150944936869, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2679, "num_deletes": 256, "total_data_size": 8165351, "memory_usage": 8395080, "flush_reason": "Manual Compaction"} Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Nov 26 04:55:44 localhost ceph-mgr[287388]: mgr[py] Loading python module 'localpool' Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150944966660, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 4896912, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14689, "largest_seqno": 17363, "table_properties": {"data_size": 4886210, "index_size": 6695, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2949, "raw_key_size": 25807, "raw_average_key_size": 22, "raw_value_size": 4863407, "raw_average_value_size": 4156, "num_data_blocks": 290, "num_entries": 1170, "num_filter_entries": 1170, "num_deletions": 254, "num_merge_operands": 0, 
"num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150842, "oldest_key_time": 1764150842, "file_creation_time": 1764150944, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 29898 microseconds, and 12469 cpu microseconds. Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:44.966715) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 4896912 bytes OK Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:44.966743) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:44.971714) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:44.971741) EVENT_LOG_v1 {"time_micros": 1764150944971735, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:44.971792) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8152702, prev total WAL file size 8158135, number of live WAL files 2. Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:44.973594) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. 
'7061786F73003131323936' seq:0, type:0; will stop at (end) Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4782KB)], [24(14MB)] Nov 26 04:55:44 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150944973711, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 19703430, "oldest_snapshot_seqno": -1} Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Loading python module 'mds_autoscaler' Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11587 keys, 16579086 bytes, temperature: kUnknown Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150945057380, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 16579086, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16511811, "index_size": 37222, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28997, "raw_key_size": 310696, "raw_average_key_size": 26, "raw_value_size": 16313185, "raw_average_value_size": 1407, "num_data_blocks": 1418, "num_entries": 11587, "num_filter_entries": 11587, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", 
"compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764150944, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:45.057731) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 16579086 bytes Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:45.060506) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.3 rd, 197.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.7, 14.1 +0.0 blob) out(15.8 +0.0 blob), read-write-amplify(7.4) write-amplify(3.4) OK, records in: 12132, records dropped: 545 output_compression: NoCompression Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:45.060532) EVENT_LOG_v1 {"time_micros": 1764150945060520, "job": 12, "event": "compaction_finished", "compaction_time_micros": 83755, "compaction_time_cpu_micros": 54712, "output_level": 6, "num_output_files": 1, "total_output_size": 16579086, "num_input_records": 12132, "num_output_records": 11587, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 04:55:45 localhost 
ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:44.973447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:45.060660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:45.060670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:45.060673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:45.060676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:55:45.060679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150945062504, "job": 0, "event": "table_file_deletion", "file_number": 26} Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:55:45 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764150945065327, "job": 0, "event": "table_file_deletion", "file_number": 24} Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Loading python module 'mirroring' Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] 
Loading python module 'nfs' Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Loading python module 'orchestrator' Nov 26 04:55:45 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:45.400+0000 7f8fa3ac1140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:45.545+0000 7f8fa3ac1140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Loading python module 'osd_perf_query' Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:45.609+0000 7f8fa3ac1140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Loading python module 'osd_support' Nov 26 04:55:45 localhost nova_compute[281415]: 2025-11-26 09:55:45.648 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Loading python module 'pg_autoscaler' Nov 26 04:55:45 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:45.666+0000 7f8fa3ac1140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost 
ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:45.736+0000 7f8fa3ac1140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Loading python module 'progress' Nov 26 04:55:45 localhost openstack_network_exporter[242153]: ERROR 09:55:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:55:45 localhost openstack_network_exporter[242153]: ERROR 09:55:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:55:45 localhost openstack_network_exporter[242153]: ERROR 09:55:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:55:45 localhost openstack_network_exporter[242153]: ERROR 09:55:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:55:45 localhost openstack_network_exporter[242153]: Nov 26 04:55:45 localhost openstack_network_exporter[242153]: ERROR 09:55:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:55:45 localhost openstack_network_exporter[242153]: Nov 26 04:55:45 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:45.798+0000 7f8fa3ac1140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Module progress has missing NOTIFY_TYPES member Nov 26 04:55:45 localhost ceph-mgr[287388]: mgr[py] Loading python module 'prometheus' Nov 26 04:55:45 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:45 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:45 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:45 localhost ceph-mon[297296]: from='mgr.34351 ' 
entity='mgr.np0005536119.eupicg' Nov 26 04:55:45 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:45 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Loading python module 'rbd_support' Nov 26 04:55:46 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:46.096+0000 7f8fa3ac1140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Loading python module 'restful' Nov 26 04:55:46 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:46.179+0000 7f8fa3ac1140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Loading python module 'rgw' Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Loading python module 'rook' Nov 26 04:55:46 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:46.502+0000 7f8fa3ac1140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Module rook has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:46.924+0000 7f8fa3ac1140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Loading python module 'selftest' Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 26 04:55:46 localhost ceph-mgr[287388]: mgr[py] 
Loading python module 'snap_schedule' Nov 26 04:55:46 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:46.984+0000 7f8fa3ac1140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'stats' Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'status' Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost 
ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Module status has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'telegraf' Nov 26 04:55:47 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:47.170+0000 7f8fa3ac1140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost 
ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:47.228+0000 7f8fa3ac1140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'telemetry' Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:47.358+0000 7f8fa3ac1140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'test_orchestrator' Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:47.502+0000 7f8fa3ac1140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'volumes' Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:47.687+0000 7f8fa3ac1140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Loading python module 'zabbix' Nov 26 04:55:47 localhost ceph-mgr[287388]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-mgr-np0005536118-anceyj[287384]: 2025-11-26T09:55:47.744+0000 7f8fa3ac1140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 26 04:55:47 localhost ceph-mgr[287388]: ms_deliver_dispatch: unhandled message 0x56547ef9f1e0 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Nov 26 04:55:47 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3354046426 
Nov 26 04:55:48 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536117.localdomain to 836.6M Nov 26 04:55:48 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536119.localdomain to 836.6M Nov 26 04:55:48 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536117.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:55:48 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536119.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:55:48 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536118.localdomain to 836.6M Nov 26 04:55:48 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536118.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 04:55:48 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.conf Nov 26 04:55:48 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.conf Nov 26 04:55:48 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.conf Nov 26 04:55:48 localhost nova_compute[281415]: 2025-11-26 09:55:48.797 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:49 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:55:49 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:55:49 localhost ceph-mon[297296]: Updating 
np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.conf Nov 26 04:55:49 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:55:49 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:55:49 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 26 04:55:49 localhost sshd[305234]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:55:50 localhost ceph-mon[297296]: Updating np0005536119.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:55:50 localhost ceph-mon[297296]: Updating np0005536117.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:55:50 localhost ceph-mon[297296]: Updating np0005536118.localdomain:/var/lib/ceph/0d5e5e6d-3c4b-5efe-8c65-346ae6715606/config/ceph.client.admin.keyring Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:55:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:50 localhost nova_compute[281415]: 
2025-11-26 09:55:50.675 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:55:53 localhost nova_compute[281415]: 2025-11-26 09:55:53.852 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:53 localhost podman[305379]: 2025-11-26 09:55:53.879049899 +0000 UTC m=+0.131644525 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:55:53 localhost podman[305379]: 2025-11-26 09:55:53.914200065 +0000 UTC m=+0.166794681 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:55:53 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:55:53 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:55:54 localhost podman[305380]: 2025-11-26 09:55:54.003699621 +0000 UTC m=+0.255002248 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm) Nov 26 04:55:54 localhost podman[305380]: 
2025-11-26 09:55:54.017213201 +0000 UTC m=+0.268515788 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 26 04:55:54 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:55:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:54 localhost nova_compute[281415]: 2025-11-26 09:55:54.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:54 localhost nova_compute[281415]: 2025-11-26 09:55:54.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:55 localhost nova_compute[281415]: 2025-11-26 09:55:55.707 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:55 localhost nova_compute[281415]: 2025-11-26 09:55:55.842 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:55 localhost nova_compute[281415]: 2025-11-26 09:55:55.863 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:56 localhost nova_compute[281415]: 2025-11-26 09:55:56.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:57 localhost podman[240049]: time="2025-11-26T09:55:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:55:57 localhost podman[240049]: @ - - [26/Nov/2025:09:55:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:55:57 localhost podman[240049]: @ - - [26/Nov/2025:09:55:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18722 "" "Go-http-client/1.1" Nov 26 04:55:57 localhost nova_compute[281415]: 2025-11-26 09:55:57.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:57 localhost nova_compute[281415]: 2025-11-26 09:55:57.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:57 localhost nova_compute[281415]: 2025-11-26 09:55:57.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:55:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:55:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:55:58 localhost podman[305421]: 2025-11-26 09:55:58.839704863 +0000 UTC m=+0.097480669 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:55:58 localhost nova_compute[281415]: 2025-11-26 09:55:58.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:58 localhost nova_compute[281415]: 2025-11-26 09:55:58.887 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:55:58 localhost podman[305422]: 2025-11-26 09:55:58.900688602 +0000 UTC m=+0.154910660 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 26 04:55:58 localhost podman[305421]: 2025-11-26 09:55:58.906493088 +0000 UTC m=+0.164268864 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118) Nov 26 04:55:58 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:55:58 localhost podman[305422]: 2025-11-26 09:55:58.963363514 +0000 UTC m=+0.217585582 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': 
'/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter) Nov 26 04:55:58 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:55:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:55:59 localhost nova_compute[281415]: 2025-11-26 09:55:59.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:55:59 localhost nova_compute[281415]: 2025-11-26 09:55:59.878 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:55:59 localhost nova_compute[281415]: 2025-11-26 09:55:59.879 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:55:59 localhost nova_compute[281415]: 2025-11-26 09:55:59.879 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:55:59 localhost nova_compute[281415]: 2025-11-26 09:55:59.879 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:55:59 localhost nova_compute[281415]: 2025-11-26 09:55:59.880 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:56:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:56:00 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3277496611' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.353 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.456 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.457 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.687 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to 
have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.689 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11750MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", 
"address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.690 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.691 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.745 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.763 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.763 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.764 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:56:00 localhost nova_compute[281415]: 2025-11-26 09:56:00.791 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:56:01 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:56:01 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/280536175' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:56:01 localhost nova_compute[281415]: 2025-11-26 09:56:01.307 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:56:01 localhost nova_compute[281415]: 2025-11-26 09:56:01.316 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:56:01 localhost nova_compute[281415]: 2025-11-26 09:56:01.333 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:56:01 localhost nova_compute[281415]: 2025-11-26 09:56:01.336 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:56:01 localhost nova_compute[281415]: 2025-11-26 09:56:01.336 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:56:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:56:02 localhost systemd[1]: tmp-crun.rnJmIS.mount: Deactivated successfully. Nov 26 04:56:02 localhost podman[305510]: 2025-11-26 09:56:02.837239825 +0000 UTC m=+0.093723035 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:56:02 localhost podman[305510]: 2025-11-26 09:56:02.847260959 +0000 UTC m=+0.103744189 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 04:56:02 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.584 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.585 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.590 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7a908660-8495-4b45-b7c8-3892627b01e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.585684', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1ea7b0aa-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': '0d4ded33520111e384bb52bb2fcdc8444efa1934d3595b84f69089c2b2778b58'}]}, 'timestamp': '2025-11-26 09:56:03.591667', '_unique_id': 'f02a383b94fc464d917bd5b307314f14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.594 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.595 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.628 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.629 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8891e90-55bd-4814-b172-432ed054336c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.596123', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ead7706-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': '7df2f26cfd9c23df5ae515e49db142f5e005eb2bd416f003582f827135a5a06e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:56:03.596123', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ead8b4c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': '3ab918662d4e82773c114e4fea532820669ca17ea3671e1d040231f3df3d161e'}]}, 'timestamp': '2025-11-26 09:56:03.629776', '_unique_id': '7b9daba1ff1541358fd9c6171dd22a31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.631 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.648 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 14810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b1744e8-f04a-4909-8af3-5ac884453fe8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14810000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:56:03.632730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1eb0836a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.89077449, 'message_signature': 'a0410feff9943b975887e6c730462b44adbdab93b694d99ff376f02881bf273b'}]}, 'timestamp': '2025-11-26 09:56:03.649276', '_unique_id': '4fe3de21aeb24102b7ec1568663c0b7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.650 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.651 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.651 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70557eac-5095-43c0-bc30-1ef625e91f86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.651526', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eb0eed6-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': 'c4cf0b0011831b1a22371f60ba89f472565ce9169d20b4776c20a49cded90fe2'}]}, 'timestamp': '2025-11-26 09:56:03.652041', '_unique_id': '1959de6709bf41ddb17654b35f4111ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.652 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.654 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.654 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.654 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23b3359c-f2b6-4779-b153-b8b630f8cf5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.654176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eb1560a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': '11ac93f07f9cfb4875688009b184bbee1911ea962d8f22d2b0621a43a29a167d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:56:03.654176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eb1674e-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': 'c64688450e0b7f43ab55ab1e09162d1d82bc681c5a64b5fca1d4427c0b05436d'}]}, 'timestamp': '2025-11-26 09:56:03.655076', '_unique_id': 'a1070714e7a54cbcb835100498ef7e91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.657 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9ac31b0-2a4d-4bb5-804a-0fa5f73b6beb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.657238', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eb1cdd8-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': '5c8dadd9da0308d84e0a4c55348215fd4be77573456c3789854c87c5c498514a'}]}, 'timestamp': '2025-11-26 09:56:03.657701', '_unique_id': '1f1cbb3f848a4b079077f6dbcda9381a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.658 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.659 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:56:03.662 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" 
by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0f376e1-f0d6-484c-aa32-11f21b5b541f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.659959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eb23804-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': 'c08e0ccc8f534aa9b15f3188915f2c759d9313f6f36d00803a233f501e256ac3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 
'9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:56:03.659959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eb24812-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': 'abb35ab4240a52ba976932a07faa6c8b785721df6a339ecff08d13edd83a3731'}]}, 'timestamp': '2025-11-26 09:56:03.660823', '_unique_id': '57fa9e4f4f66424db1e94fe466b38548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:56:03.661 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.661 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.662 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 
Nov 26 04:56:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:56:03.663 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409 Nov 26 04:56:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:56:03.663 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '116b4703-b211-4664-8de5-d3e13cd536c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.662960', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eb2ad5c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': '7df77c130f068dd6c17bb3bc603a6eae2e587d0b3fd1ad9785194a0b0a47176d'}]}, 'timestamp': '2025-11-26 09:56:03.663420', '_unique_id': '24e96d77043e480baf8cbdc7d6685990'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, 
in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 
26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.665 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.676 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.677 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '90eb67b3-e56e-4488-bcf5-79179644ccb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.665508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eb4cd6c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.907756806, 'message_signature': '1522fba3d5a05175f0f9702cb805177718fe0fa7a35a1af931d1f119c797fc10'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:56:03.665508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eb4e086-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.907756806, 'message_signature': '322da4b3a7684473d947a1c131ecfaf6691350fcafbc8d3ed61ca048a82f64e4'}]}, 'timestamp': '2025-11-26 09:56:03.677837', '_unique_id': '8bf64fc63484426c8e87771d7d2354ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:56:03.679 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:56:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.680 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.680 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.680 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.681 12 DEBUG 
ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fbaa28a-139f-406d-a963-a681599a8444', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.680564', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eb55dc2-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.907756806, 'message_signature': '48cbd113cd9fc63a4b4274dc90051273cc09421d39ba8ca2a379a8f3bc1d8efb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:56:03.680564', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eb57906-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.907756806, 'message_signature': '2f26f8176eef0e1983a2ee9f62c1379152bf0be4925b3bc3dcd085348d4e5224'}]}, 'timestamp': '2025-11-26 09:56:03.681808', '_unique_id': 'de0ce071033c41d58b6c6bf596b3b439'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:56:03.682 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.684 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 
04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.684 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b91188d2-79df-44b5-8c4c-a1b30d48eb4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.684147', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eb5e8c8-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': 'ff0bb9636faf081c50d3dad31396e148a38fea3f93446557c6e2f481a60a6723'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:56:03.684147', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eb5fa48-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': 'c0a5ba938909b238621fb56471923cffc35353638113bf5b18cfca4810796446'}]}, 'timestamp': '2025-11-26 09:56:03.685070', '_unique_id': 'f04fd5c9b6274421878ab84e8b4bc448'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.687 12 DEBUG ceilometer.compute.pollsters [-] 
9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3657776e-582c-4693-a359-ee9c4e1913b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.687324', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eb6653c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': 'a5a68800a1356c59f82b313b000493402f74dfec0cdd18e22eb8305e30ff5058'}]}, 'timestamp': '2025-11-26 
09:56:03.687791', '_unique_id': '6421f34d931b42f0b36c45eafa850b85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in 
_establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR 
oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:56:03.688 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ffcc352-5534-4adf-acab-89e0165d0238', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.690020', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eb6ce64-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': '58c4c28868f3fe3e01edd4c94a0c4f542b58ce26f078ce5c86180eabab828666'}]}, 'timestamp': '2025-11-26 09:56:03.690482', '_unique_id': '9ac25ddf21de4925a0d97e3d1edaccdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging return fun(*args, 
**kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.692 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19b8b60a-d470-446f-b2e9-6b058a669b8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.692980', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eb7459c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': '1092b8a594b413b44b232f9fdb88fd4e6fcb8c32b570449eea7dfef5b7853b56'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 
'timestamp': '2025-11-26T09:56:03.692980', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eb75c30-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': '951d189424dd4aacd7b079592fb4d08d15ac0702ba01a875ba612b4b54807ac1'}]}, 'timestamp': '2025-11-26 09:56:03.694211', '_unique_id': 'b6dff8dbac174b569ab2e5587885e88a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.697 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.697 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1d66b5ca-f2ab-42e7-b9d7-7bbaf8e3a1a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.697233', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eb7ebfa-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': 'f84917e0e54aa773ee4c667aa592c1b7d2fe9cb77f81b9bb2f8eb1b2014d4af1'}]}, 'timestamp': '2025-11-26 09:56:03.697810', '_unique_id': 'd8c4b1102dcf443a9bef5095130d9871'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.699 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.700 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.700 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c1e6d15-3216-4d57-b97e-a0ac4323f662', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.699963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eb852d4-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': '04330db32aeb646be29f2b7b8f17c892a76d92a0c0d69e0bd0a29d186d55a381'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:56:03.699963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eb8630a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.838427963, 'message_signature': 'e4b4d9a2c9cca9d77d21254ab6ee1a8abe3d663c6b33a45754b5f89d89598b84'}]}, 'timestamp': '2025-11-26 09:56:03.700838', '_unique_id': 'b38a4aaf0dd84b0a8224335cfe321e0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.701 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.702 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4166cce7-9c6d-45c8-bc18-840885293d65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:56:03.703017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eb8c9ee-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.907756806, 'message_signature': 'ef024a7b8d10338e8ae2a0233d33284a18951ab36421bfba7ec5b02aa686ba7f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:56:03.703017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eb8da1a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.907756806, 'message_signature': '060d267c1e115781998ce4c4755c2147152fd9ef9cfe79654ed51806b5f5bbb6'}]}, 'timestamp': '2025-11-26 09:56:03.703865', '_unique_id': 'd5475473de114bf988e537b29ec9105a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.704 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.705 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.706 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ceabd82-244c-407b-8879-438bf1ce5247', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:56:03.706076', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1eb94180-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.89077449, 'message_signature': 'c21d4f70199109c6719ae66109319971485933b1fae7b3fcf39c9fa9b5b15391'}]}, 'timestamp': '2025-11-26 09:56:03.706521', '_unique_id': '0219d7657bcf4b40b30412a47712aca5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.707 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.708 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.708 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c12a757e-b865-4bbf-a941-395c13967fc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.708906', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eb9b17e-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': '91206ce582cfc21a2ee669cea4217de6d1a147343110054e2c59c60b548027e9'}]}, 'timestamp': '2025-11-26 09:56:03.709404', '_unique_id': 'c42e03611e2442cd8b6e9f0c98be0efe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.710 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52b20610-07ca-42f6-962e-c3dd43eb0a44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7111, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.710820', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eb9f85a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': '2a8e37aaa690abf6c5eb98da18898348e2b7939db6c2bad3333ea23628dfae62'}]}, 'timestamp': '2025-11-26 09:56:03.711132', '_unique_id': '52a7fbc3851f4ac3b521ec31c09a5227'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:56:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.711 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.712 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.712 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4931458f-371d-4f3f-81c5-a900e38c0ced', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:56:03.712412', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1eba35fe-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11438.827941815, 'message_signature': '67a6b8498c23dd8ef70adbc405daf88de7402253ceddb40dfa08c76f84a0bf84'}]}, 'timestamp': '2025-11-26 09:56:03.712708', '_unique_id': 'f053f375272642d1a5aff4f4a008e115'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:56:03.713 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:56:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:56:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:56:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:56:03.713 12 ERROR oslo_messaging.notify.messaging Nov 26 04:56:03 localhost nova_compute[281415]: 2025-11-26 09:56:03.933 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:04 localhost nova_compute[281415]: 2025-11-26 09:56:04.336 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:56:04 localhost nova_compute[281415]: 2025-11-26 09:56:04.337 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:56:04 localhost nova_compute[281415]: 2025-11-26 09:56:04.337 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:56:04 localhost nova_compute[281415]: 2025-11-26 
09:56:04.404 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:56:04 localhost nova_compute[281415]: 2025-11-26 09:56:04.405 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:56:04 localhost nova_compute[281415]: 2025-11-26 09:56:04.405 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:56:04 localhost nova_compute[281415]: 2025-11-26 09:56:04.406 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:56:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:56:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5908 writes, 25K keys, 5908 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5908 writes, 867 syncs, 6.81 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 180 writes, 421 keys, 180 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s#012Interval WAL: 180 writes, 86 syncs, 2.09 writes per sync, written: 0.00 GB, 0.00 
MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 04:56:05 localhost nova_compute[281415]: 2025-11-26 09:56:05.343 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:56:05 localhost nova_compute[281415]: 2025-11-26 09:56:05.400 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:56:05 localhost nova_compute[281415]: 2025-11-26 09:56:05.401 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 
9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:56:05 localhost nova_compute[281415]: 2025-11-26 09:56:05.788 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:56:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:56:08 localhost podman[305533]: 2025-11-26 09:56:08.821876195 +0000 UTC m=+0.081672539 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:56:08 localhost podman[305533]: 2025-11-26 09:56:08.854725072 +0000 UTC m=+0.114521376 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 26 04:56:08 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:56:08 localhost podman[305534]: 2025-11-26 09:56:08.868736227 +0000 UTC m=+0.124395146 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:56:08 localhost podman[305534]: 2025-11-26 09:56:08.88435199 +0000 UTC m=+0.140010889 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 04:56:08 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:56:08 localhost nova_compute[281415]: 2025-11-26 09:56:08.972 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 04:56:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 4999 writes, 22K keys, 4999 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4999 writes, 678 syncs, 7.37 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 33 writes, 128 keys, 33 commit groups, 1.0 writes per commit group, ingest: 0.20 MB, 0.00 MB/s#012Interval WAL: 33 writes, 15 syncs, 2.20 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 04:56:10 localhost nova_compute[281415]: 2025-11-26 09:56:10.819 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:14 localhost nova_compute[281415]: 2025-11-26 09:56:14.015 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:15 localhost openstack_network_exporter[242153]: ERROR 09:56:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:56:15 localhost openstack_network_exporter[242153]: ERROR 09:56:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:56:15 localhost openstack_network_exporter[242153]: ERROR 09:56:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:56:15 localhost openstack_network_exporter[242153]: ERROR 09:56:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:56:15 localhost openstack_network_exporter[242153]: Nov 26 04:56:15 localhost openstack_network_exporter[242153]: ERROR 09:56:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:56:15 localhost openstack_network_exporter[242153]: Nov 26 04:56:15 localhost nova_compute[281415]: 2025-11-26 09:56:15.822 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:19 localhost nova_compute[281415]: 2025-11-26 09:56:19.019 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:20 localhost nova_compute[281415]: 2025-11-26 09:56:20.858 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:24 localhost nova_compute[281415]: 2025-11-26 09:56:24.053 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:56:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:56:24 localhost podman[305571]: 2025-11-26 09:56:24.24431142 +0000 UTC m=+0.092528828 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:56:24 localhost podman[305571]: 2025-11-26 09:56:24.252381745 +0000 UTC m=+0.100599153 container exec_died 
b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:56:24 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:56:24 localhost podman[305572]: 2025-11-26 09:56:24.3448157 +0000 UTC m=+0.190335376 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm) Nov 26 04:56:24 localhost podman[305572]: 2025-11-26 09:56:24.356716131 +0000 UTC m=+0.202235777 container exec_died 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm) Nov 26 04:56:24 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 04:56:25 localhost nova_compute[281415]: 2025-11-26 09:56:25.895 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:27 localhost sshd[305613]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:56:27 localhost podman[240049]: time="2025-11-26T09:56:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:56:27 localhost podman[240049]: @ - - [26/Nov/2025:09:56:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:56:27 localhost podman[240049]: @ - - [26/Nov/2025:09:56:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18732 "" "Go-http-client/1.1" Nov 26 04:56:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:29 localhost nova_compute[281415]: 2025-11-26 09:56:29.106 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:56:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:56:29 localhost podman[305616]: 2025-11-26 09:56:29.842407425 +0000 UTC m=+0.100645505 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:56:29 localhost podman[305615]: 2025-11-26 09:56:29.808985691 +0000 UTC m=+0.073270565 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3) Nov 26 04:56:29 localhost podman[305616]: 2025-11-26 09:56:29.884623566 +0000 UTC m=+0.142861636 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git) Nov 26 04:56:29 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:56:29 localhost podman[305615]: 2025-11-26 09:56:29.939895962 +0000 UTC m=+0.204180886 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller) Nov 26 04:56:29 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:56:30 localhost nova_compute[281415]: 2025-11-26 09:56:30.935 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:56:33 localhost podman[305661]: 2025-11-26 09:56:33.825305134 +0000 UTC m=+0.082603197 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:56:33 localhost podman[305661]: 2025-11-26 09:56:33.838294208 +0000 UTC m=+0.095592341 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:56:33 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:56:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:34 localhost nova_compute[281415]: 2025-11-26 09:56:34.143 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:35 localhost nova_compute[281415]: 2025-11-26 09:56:35.968 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:39 localhost nova_compute[281415]: 2025-11-26 09:56:39.146 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:56:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 04:56:39 localhost podman[305686]: 2025-11-26 09:56:39.83985862 +0000 UTC m=+0.089347742 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 04:56:39 localhost podman[305686]: 2025-11-26 09:56:39.852301348 +0000 UTC m=+0.101790420 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:56:39 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:56:39 localhost podman[305685]: 2025-11-26 09:56:39.90542796 +0000 UTC m=+0.156860860 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:56:39 localhost podman[305685]: 2025-11-26 09:56:39.938309397 +0000 UTC 
m=+0.189742267 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 26 04:56:39 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 04:56:41 localhost nova_compute[281415]: 2025-11-26 09:56:41.007 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:44 localhost nova_compute[281415]: 2025-11-26 09:56:44.189 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:45 localhost openstack_network_exporter[242153]: ERROR 09:56:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:56:45 localhost openstack_network_exporter[242153]: ERROR 09:56:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:56:45 localhost openstack_network_exporter[242153]: ERROR 09:56:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:56:45 localhost openstack_network_exporter[242153]: ERROR 09:56:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:56:45 localhost openstack_network_exporter[242153]: Nov 26 04:56:45 localhost openstack_network_exporter[242153]: ERROR 09:56:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:56:45 localhost openstack_network_exporter[242153]: Nov 26 04:56:46 localhost nova_compute[281415]: 2025-11-26 09:56:46.011 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:49 localhost 
nova_compute[281415]: 2025-11-26 09:56:49.190 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:51 localhost nova_compute[281415]: 2025-11-26 09:56:51.049 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:56:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:56:54 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:56:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:54 localhost nova_compute[281415]: 2025-11-26 09:56:54.227 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:56:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:56:55 localhost podman[305806]: 2025-11-26 09:56:55.392266944 +0000 UTC m=+0.086885527 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:56:55 localhost podman[305806]: 2025-11-26 09:56:55.405280549 +0000 UTC m=+0.099899122 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:56:55 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:56:55 localhost podman[305807]: 2025-11-26 09:56:55.497978661 +0000 UTC m=+0.185770217 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm) Nov 26 04:56:55 localhost podman[305807]: 2025-11-26 09:56:55.538407587 +0000 UTC m=+0.226199133 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 26 04:56:55 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:56:55 localhost nova_compute[281415]: 2025-11-26 09:56:55.908 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:56:56 localhost nova_compute[281415]: 2025-11-26 09:56:56.096 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:56 localhost nova_compute[281415]: 2025-11-26 09:56:56.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:56:57 localhost podman[240049]: time="2025-11-26T09:56:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:56:57 localhost podman[240049]: @ - - [26/Nov/2025:09:56:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:56:57 localhost podman[240049]: @ - - [26/Nov/2025:09:56:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18728 "" "Go-http-client/1.1" Nov 26 04:56:57 localhost nova_compute[281415]: 2025-11-26 09:56:57.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:56:57 localhost nova_compute[281415]: 2025-11-26 09:56:57.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:56:58 localhost nova_compute[281415]: 2025-11-26 09:56:58.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:56:58 localhost nova_compute[281415]: 2025-11-26 09:56:58.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:56:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:56:59 localhost nova_compute[281415]: 2025-11-26 09:56:59.271 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:56:59 localhost nova_compute[281415]: 2025-11-26 09:56:59.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:56:59 localhost nova_compute[281415]: 2025-11-26 09:56:59.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:57:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:57:00 localhost podman[305851]: 2025-11-26 09:57:00.831592619 +0000 UTC m=+0.084132414 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Nov 26 04:57:00 localhost podman[305851]: 2025-11-26 09:57:00.843535091 +0000 UTC m=+0.096074886 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 04:57:00 localhost nova_compute[281415]: 2025-11-26 09:57:00.851 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:00 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:57:00 localhost nova_compute[281415]: 2025-11-26 09:57:00.878 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:57:00 localhost nova_compute[281415]: 2025-11-26 09:57:00.879 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:57:00 localhost nova_compute[281415]: 2025-11-26 09:57:00.879 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:57:00 localhost nova_compute[281415]: 2025-11-26 09:57:00.880 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:57:00 localhost nova_compute[281415]: 2025-11-26 09:57:00.880 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:57:00 localhost systemd[1]: tmp-crun.o7vNo2.mount: Deactivated successfully. 
Nov 26 04:57:00 localhost podman[305850]: 2025-11-26 09:57:00.94537591 +0000 UTC m=+0.202862105 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller) Nov 26 04:57:01 localhost podman[305850]: 2025-11-26 09:57:01.030358349 +0000 UTC m=+0.287844554 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:57:01 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.097 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:01 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:57:01 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1879699094' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.313 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.381 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.382 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.599 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.601 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11706MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.602 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.602 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.680 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.681 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.681 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:57:01 localhost nova_compute[281415]: 2025-11-26 09:57:01.729 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:57:02 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:57:02 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1195976097' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:57:02 localhost nova_compute[281415]: 2025-11-26 09:57:02.183 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:57:02 localhost nova_compute[281415]: 2025-11-26 09:57:02.191 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:57:02 localhost nova_compute[281415]: 2025-11-26 09:57:02.208 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:57:02 localhost nova_compute[281415]: 2025-11-26 09:57:02.211 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:57:02 localhost nova_compute[281415]: 2025-11-26 09:57:02.211 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:57:03 localhost sshd[305937]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:57:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:03.664 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:57:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:03.666 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:57:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:03.666 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:57:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.209 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.210 281419 DEBUG nova.compute.manager 
[None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.210 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.309 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.388 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.388 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.388 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.389 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:57:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.818 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:57:04 localhost podman[305939]: 2025-11-26 09:57:04.830375609 +0000 UTC m=+0.090713373 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, 
container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.837 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.838 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:57:04 localhost podman[305939]: 2025-11-26 09:57:04.867361401 +0000 UTC m=+0.127699165 container exec_died 
4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:57:04 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:57:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:04.994 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:57:04 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:04.996 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 04:57:04 localhost nova_compute[281415]: 2025-11-26 09:57:04.995 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:06 localhost nova_compute[281415]: 2025-11-26 09:57:06.125 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:09 localhost nova_compute[281415]: 2025-11-26 09:57:09.311 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:09 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:09.998 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, 
record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:57:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e88 e88: 6 total, 6 up, 6 in Nov 26 04:57:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:57:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:57:10 localhost systemd[1]: tmp-crun.Q3zJYm.mount: Deactivated successfully. Nov 26 04:57:10 localhost podman[305964]: 2025-11-26 09:57:10.807342125 +0000 UTC m=+0.072478920 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:57:10 localhost systemd[1]: tmp-crun.9F0yQr.mount: Deactivated successfully. Nov 26 04:57:10 localhost podman[305963]: 2025-11-26 09:57:10.828747944 +0000 UTC m=+0.092217539 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 26 04:57:10 localhost podman[305964]: 2025-11-26 09:57:10.855613129 +0000 UTC m=+0.120749904 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118) Nov 26 04:57:10 localhost podman[305963]: 2025-11-26 09:57:10.864198929 +0000 UTC m=+0.127668524 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent) Nov 26 04:57:10 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:57:10 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:57:11 localhost nova_compute[281415]: 2025-11-26 09:57:11.168 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 e89: 6 total, 6 up, 6 in Nov 26 04:57:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:14 localhost nova_compute[281415]: 2025-11-26 09:57:14.347 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:15 localhost openstack_network_exporter[242153]: ERROR 09:57:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:57:15 localhost openstack_network_exporter[242153]: ERROR 09:57:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:57:15 localhost openstack_network_exporter[242153]: ERROR 09:57:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:57:15 localhost openstack_network_exporter[242153]: 
ERROR 09:57:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:57:15 localhost openstack_network_exporter[242153]: Nov 26 04:57:15 localhost openstack_network_exporter[242153]: ERROR 09:57:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:57:15 localhost openstack_network_exporter[242153]: Nov 26 04:57:16 localhost nova_compute[281415]: 2025-11-26 09:57:16.172 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:19 localhost nova_compute[281415]: 2025-11-26 09:57:19.350 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:21 localhost nova_compute[281415]: 2025-11-26 09:57:21.215 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:24 localhost nova_compute[281415]: 2025-11-26 09:57:24.394 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:57:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:57:25 localhost podman[306004]: 2025-11-26 09:57:25.833351934 +0000 UTC m=+0.092706374 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:57:25 localhost podman[306005]: 2025-11-26 09:57:25.885251089 +0000 UTC m=+0.140374060 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:57:25 localhost podman[306004]: 2025-11-26 09:57:25.894435297 +0000 UTC m=+0.153789787 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible) Nov 26 04:57:25 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:57:25 localhost podman[306005]: 2025-11-26 09:57:25.948801667 +0000 UTC m=+0.203924648 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 
26 04:57:25 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:57:26 localhost nova_compute[281415]: 2025-11-26 09:57:26.248 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:27 localhost podman[240049]: time="2025-11-26T09:57:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:57:27 localhost podman[240049]: @ - - [26/Nov/2025:09:57:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:57:27 localhost podman[240049]: @ - - [26/Nov/2025:09:57:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18729 "" "Go-http-client/1.1" Nov 26 04:57:28 localhost sshd[306047]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:57:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:29 localhost nova_compute[281415]: 2025-11-26 09:57:29.427 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:31 localhost nova_compute[281415]: 2025-11-26 09:57:31.294 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:57:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:57:31 localhost podman[306049]: 2025-11-26 09:57:31.821458599 +0000 UTC m=+0.079490312 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 04:57:31 localhost podman[306049]: 2025-11-26 09:57:31.862484184 +0000 UTC m=+0.120515977 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:57:31 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 04:57:31 localhost podman[306050]: 2025-11-26 09:57:31.886324357 +0000 UTC m=+0.142407931 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, name=ubi9-minimal, release=1755695350, distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers) Nov 26 04:57:31 localhost podman[306050]: 2025-11-26 09:57:31.927882388 +0000 UTC m=+0.183965932 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., 
distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible) Nov 26 04:57:31 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:57:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:34 localhost nova_compute[281415]: 2025-11-26 09:57:34.461 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:57:35 localhost podman[306093]: 2025-11-26 09:57:35.814185504 +0000 UTC m=+0.076552133 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:57:35 localhost podman[306093]: 2025-11-26 09:57:35.850380352 +0000 UTC m=+0.112747021 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:57:35 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:57:36 localhost nova_compute[281415]: 2025-11-26 09:57:36.335 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:36 localhost ovn_controller[153664]: 2025-11-26T09:57:36Z|00067|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Nov 26 04:57:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:39 localhost nova_compute[281415]: 2025-11-26 09:57:39.499 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:39.836 262471 INFO oslo.privsep.daemon [None req-f7645042-a93a-4621-8eaf-7a8e715fbdb1 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp6czcc0_q/privsep.sock']#033[00m Nov 26 04:57:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:40.488 262471 INFO oslo.privsep.daemon [None req-f7645042-a93a-4621-8eaf-7a8e715fbdb1 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 26 04:57:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:40.359 306121 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 26 04:57:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:40.364 306121 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 26 04:57:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:40.369 306121 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): 
CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Nov 26 04:57:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:40.369 306121 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306121#033[00m Nov 26 04:57:40 localhost sshd[306126]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:57:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:57:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:57:41 localhost podman[306128]: 2025-11-26 09:57:41.036184093 +0000 UTC m=+0.092944380 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 26 04:57:41 localhost systemd[1]: tmp-crun.vdX7lG.mount: Deactivated successfully. Nov 26 04:57:41 localhost podman[306129]: 2025-11-26 09:57:41.105101835 +0000 UTC m=+0.160303775 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 04:57:41 localhost podman[306129]: 2025-11-26 09:57:41.116436279 +0000 UTC m=+0.171638189 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 04:57:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:41.122 262471 INFO oslo.privsep.daemon [None req-f7645042-a93a-4621-8eaf-7a8e715fbdb1 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpzm3wneak/privsep.sock']#033[00m Nov 26 04:57:41 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 04:57:41 localhost podman[306128]: 2025-11-26 09:57:41.172688025 +0000 UTC m=+0.229448302 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 04:57:41 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:57:41 localhost nova_compute[281415]: 2025-11-26 09:57:41.384 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:41.843 262471 INFO oslo.privsep.daemon [None req-f7645042-a93a-4621-8eaf-7a8e715fbdb1 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 26 04:57:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:41.724 306168 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 26 04:57:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:41.729 306168 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 26 04:57:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:41.732 306168 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Nov 26 04:57:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:41.733 306168 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306168#033[00m Nov 26 04:57:42 localhost nova_compute[281415]: 2025-11-26 09:57:42.250 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:42 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:42.916 262471 INFO oslo.privsep.daemon [None req-f7645042-a93a-4621-8eaf-7a8e715fbdb1 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpjtrz1iwr/privsep.sock']#033[00m Nov 26 04:57:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:43.590 262471 INFO 
oslo.privsep.daemon [None req-f7645042-a93a-4621-8eaf-7a8e715fbdb1 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 26 04:57:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:43.470 306180 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 26 04:57:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:43.475 306180 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 26 04:57:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:43.479 306180 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Nov 26 04:57:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:43.479 306180 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306180#033[00m Nov 26 04:57:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:44 localhost nova_compute[281415]: 2025-11-26 09:57:44.533 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:44 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:44.987 262471 INFO neutron.agent.linux.ip_lib [None req-f7645042-a93a-4621-8eaf-7a8e715fbdb1 - - - - - -] Device tap0ce966e4-04 cannot be used as it has no MAC address#033[00m Nov 26 04:57:45 localhost nova_compute[281415]: 2025-11-26 09:57:45.068 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:45 localhost kernel: device tap0ce966e4-04 entered promiscuous mode Nov 26 04:57:45 localhost ovn_controller[153664]: 2025-11-26T09:57:45Z|00068|binding|INFO|Claiming lport 0ce966e4-04ff-4d33-bbb4-793f124d8338 for this chassis. 
Nov 26 04:57:45 localhost ovn_controller[153664]: 2025-11-26T09:57:45Z|00069|binding|INFO|0ce966e4-04ff-4d33-bbb4-793f124d8338: Claiming unknown Nov 26 04:57:45 localhost NetworkManager[5970]: [1764151065.0838] manager: (tap0ce966e4-04): new Generic device (/org/freedesktop/NetworkManager/Devices/17) Nov 26 04:57:45 localhost nova_compute[281415]: 2025-11-26 09:57:45.085 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:45 localhost systemd-udevd[306195]: Network interface NamePolicy= disabled on kernel command line. Nov 26 04:57:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:45.093 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4dafc326e594f1996993253bb2d58d6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f9e5a10-6cc2-43d9-a208-4616c5b15844, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0ce966e4-04ff-4d33-bbb4-793f124d8338) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:57:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:45.095 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0ce966e4-04ff-4d33-bbb4-793f124d8338 in datapath 4d6c05df-68f7-4c5b-baae-8e36a676fee9 bound to our chassis#033[00m Nov 26 04:57:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:45.097 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 693a1fc3-b336-4d61-952b-36b18b17363b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:57:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:45.098 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d6c05df-68f7-4c5b-baae-8e36a676fee9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:57:45 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:45.100 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[51a80eb6-4b9f-41cc-9f50-bae25859be76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:57:45 localhost journal[229445]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Nov 26 04:57:45 localhost journal[229445]: hostname: np0005536118.localdomain Nov 26 04:57:45 localhost journal[229445]: ethtool ioctl error on tap0ce966e4-04: No such device Nov 26 04:57:45 localhost ovn_controller[153664]: 2025-11-26T09:57:45Z|00070|binding|INFO|Setting lport 0ce966e4-04ff-4d33-bbb4-793f124d8338 ovn-installed in OVS Nov 26 04:57:45 localhost ovn_controller[153664]: 2025-11-26T09:57:45Z|00071|binding|INFO|Setting lport 0ce966e4-04ff-4d33-bbb4-793f124d8338 up in Southbound Nov 26 04:57:45 localhost nova_compute[281415]: 2025-11-26 09:57:45.118 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:45 localhost journal[229445]: ethtool ioctl error on tap0ce966e4-04: No such device Nov 26 04:57:45 localhost journal[229445]: ethtool ioctl error on tap0ce966e4-04: No such device Nov 26 04:57:45 localhost journal[229445]: ethtool ioctl error on tap0ce966e4-04: No such device Nov 26 04:57:45 localhost journal[229445]: ethtool ioctl error on tap0ce966e4-04: No such device Nov 26 04:57:45 localhost journal[229445]: ethtool ioctl error on tap0ce966e4-04: No such device Nov 26 04:57:45 localhost journal[229445]: ethtool ioctl error on tap0ce966e4-04: No such device Nov 26 04:57:45 localhost journal[229445]: ethtool ioctl error on tap0ce966e4-04: No such device Nov 26 04:57:45 localhost nova_compute[281415]: 2025-11-26 09:57:45.159 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:45 localhost nova_compute[281415]: 2025-11-26 09:57:45.185 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:45 localhost openstack_network_exporter[242153]: ERROR 09:57:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:57:45 localhost openstack_network_exporter[242153]: ERROR 09:57:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:57:45 localhost openstack_network_exporter[242153]: ERROR 09:57:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:57:45 localhost openstack_network_exporter[242153]: ERROR 09:57:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:57:45 localhost openstack_network_exporter[242153]: Nov 26 04:57:45 
localhost openstack_network_exporter[242153]: ERROR 09:57:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:57:45 localhost openstack_network_exporter[242153]: Nov 26 04:57:46 localhost podman[306267]: Nov 26 04:57:46 localhost podman[306267]: 2025-11-26 09:57:46.102265866 +0000 UTC m=+0.102895304 container create b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 04:57:46 localhost systemd[1]: Started libpod-conmon-b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c.scope. Nov 26 04:57:46 localhost podman[306267]: 2025-11-26 09:57:46.05101873 +0000 UTC m=+0.051648188 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:57:46 localhost systemd[1]: tmp-crun.UsDnuh.mount: Deactivated successfully. Nov 26 04:57:46 localhost systemd[1]: Started libcrun container. 
Nov 26 04:57:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a99d75efdc041b4844040716945a6b500b9749f2906c29185ec9f70eb18989fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:57:46 localhost podman[306267]: 2025-11-26 09:57:46.19933366 +0000 UTC m=+0.199963098 container init b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 04:57:46 localhost podman[306267]: 2025-11-26 09:57:46.210770027 +0000 UTC m=+0.211399455 container start b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:57:46 localhost dnsmasq[306286]: started, version 2.85 cachesize 150 Nov 26 04:57:46 localhost dnsmasq[306286]: DNS service limited to local subnets Nov 26 04:57:46 localhost dnsmasq[306286]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 04:57:46 localhost dnsmasq[306286]: warning: no upstream servers 
configured Nov 26 04:57:46 localhost dnsmasq-dhcp[306286]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 04:57:46 localhost dnsmasq[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/addn_hosts - 0 addresses Nov 26 04:57:46 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/host Nov 26 04:57:46 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/opts Nov 26 04:57:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:46.276 262471 INFO neutron.agent.dhcp.agent [None req-21ea673f-35e2-494a-8706-689a7cabb07a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:57:42Z, description=, device_id=a7d1d8da-b1e8-4713-8948-77794764ff14, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b7e27e1c-9cdd-40f6-ac8b-bfea5f5d4a1f, ip_allocation=immediate, mac_address=fa:16:3e:0d:b1:73, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:57:36Z, description=, dns_domain=, id=4d6c05df-68f7-4c5b-baae-8e36a676fee9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-2046186367-network, port_security_enabled=True, project_id=b4dafc326e594f1996993253bb2d58d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51394, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=104, status=ACTIVE, subnets=['8d8af917-7174-4be4-87c0-0ac5e178c1cb'], tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, updated_at=2025-11-26T09:57:38Z, vlan_transparent=None, network_id=4d6c05df-68f7-4c5b-baae-8e36a676fee9, port_security_enabled=False, 
project_id=b4dafc326e594f1996993253bb2d58d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=138, status=DOWN, tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, updated_at=2025-11-26T09:57:42Z on network 4d6c05df-68f7-4c5b-baae-8e36a676fee9#033[00m Nov 26 04:57:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:46.360 262471 INFO neutron.agent.dhcp.agent [None req-9e67ef52-9945-4ef9-9b5b-e20290c65d5a - - - - - -] DHCP configuration for ports {'4d5f2dd9-f7f6-4d61-8ddb-4559bfe1ff74'} is completed#033[00m Nov 26 04:57:46 localhost nova_compute[281415]: 2025-11-26 09:57:46.388 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:46 localhost dnsmasq[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/addn_hosts - 1 addresses Nov 26 04:57:46 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/host Nov 26 04:57:46 localhost podman[306304]: 2025-11-26 09:57:46.518293597 +0000 UTC m=+0.064979552 container kill b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:57:46 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/opts Nov 26 04:57:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:46.651 262471 INFO neutron.agent.dhcp.agent [None req-567c352c-9074-483d-85e7-980eda418a87 - 
- - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:57:42Z, description=, device_id=a7d1d8da-b1e8-4713-8948-77794764ff14, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b7e27e1c-9cdd-40f6-ac8b-bfea5f5d4a1f, ip_allocation=immediate, mac_address=fa:16:3e:0d:b1:73, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:57:36Z, description=, dns_domain=, id=4d6c05df-68f7-4c5b-baae-8e36a676fee9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-2046186367-network, port_security_enabled=True, project_id=b4dafc326e594f1996993253bb2d58d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51394, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=104, status=ACTIVE, subnets=['8d8af917-7174-4be4-87c0-0ac5e178c1cb'], tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, updated_at=2025-11-26T09:57:38Z, vlan_transparent=None, network_id=4d6c05df-68f7-4c5b-baae-8e36a676fee9, port_security_enabled=False, project_id=b4dafc326e594f1996993253bb2d58d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=138, status=DOWN, tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, updated_at=2025-11-26T09:57:42Z on network 4d6c05df-68f7-4c5b-baae-8e36a676fee9#033[00m Nov 26 04:57:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:46.723 262471 INFO neutron.agent.dhcp.agent [None req-eb7cd28b-df55-4677-9f33-cb25a36e402d - - - - - -] DHCP configuration for ports {'b7e27e1c-9cdd-40f6-ac8b-bfea5f5d4a1f'} is completed#033[00m Nov 26 04:57:46 localhost dnsmasq[306286]: read 
/var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/addn_hosts - 1 addresses Nov 26 04:57:46 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/host Nov 26 04:57:46 localhost podman[306343]: 2025-11-26 09:57:46.87931071 +0000 UTC m=+0.061384023 container kill b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 04:57:46 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/opts Nov 26 04:57:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:47.081 262471 INFO neutron.agent.dhcp.agent [None req-f78d1490-2bde-439c-aab9-64c7cbb96074 - - - - - -] DHCP configuration for ports {'b7e27e1c-9cdd-40f6-ac8b-bfea5f5d4a1f'} is completed#033[00m Nov 26 04:57:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:49 localhost nova_compute[281415]: 2025-11-26 09:57:49.536 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:51 localhost nova_compute[281415]: 2025-11-26 09:57:51.384 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:51 localhost nova_compute[281415]: 2025-11-26 09:57:51.390 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:57:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:57:54 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:57:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:54 localhost nova_compute[281415]: 2025-11-26 09:57:54.580 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:55 localhost nova_compute[281415]: 2025-11-26 09:57:55.286 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:55 localhost nova_compute[281415]: 2025-11-26 09:57:55.472 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:56 localhost nova_compute[281415]: 2025-11-26 09:57:56.423 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:57:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:57:56 localhost systemd[298302]: Created slice User Background Tasks Slice. 
Nov 26 04:57:56 localhost systemd[298302]: Starting Cleanup of User's Temporary Files and Directories... Nov 26 04:57:56 localhost systemd[298302]: Finished Cleanup of User's Temporary Files and Directories. Nov 26 04:57:56 localhost systemd[1]: tmp-crun.rGFAPa.mount: Deactivated successfully. Nov 26 04:57:56 localhost podman[306451]: 2025-11-26 09:57:56.896249058 +0000 UTC m=+0.152654003 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:57:56 localhost podman[306452]: 2025-11-26 09:57:56.852007226 +0000 UTC m=+0.106736699 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:57:56 localhost podman[306451]: 2025-11-26 09:57:56.910379997 +0000 UTC m=+0.166784952 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:57:56 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:57:56 localhost podman[306452]: 2025-11-26 09:57:56.937732097 +0000 UTC m=+0.192461530 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 04:57:56 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:57:57 localhost neutron_sriov_agent[255515]: 2025-11-26 09:57:57.384 2 INFO neutron.agent.securitygroups_rpc [None req-c7b2275a-9944-4ac4-8aaa-c6eb598f0526 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Security group member updated ['43f74207-cce2-45c0-b433-8de91a69071b']#033[00m Nov 26 04:57:57 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:57.420 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:57:57Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0d550667-720d-4ac9-8474-c7582e0d87e3, ip_allocation=immediate, mac_address=fa:16:3e:e7:38:fd, name=tempest-parent-1619212502, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:57:36Z, description=, dns_domain=, id=4d6c05df-68f7-4c5b-baae-8e36a676fee9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-2046186367-network, port_security_enabled=True, project_id=b4dafc326e594f1996993253bb2d58d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51394, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=104, status=ACTIVE, 
subnets=['8d8af917-7174-4be4-87c0-0ac5e178c1cb'], tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, updated_at=2025-11-26T09:57:38Z, vlan_transparent=None, network_id=4d6c05df-68f7-4c5b-baae-8e36a676fee9, port_security_enabled=True, project_id=b4dafc326e594f1996993253bb2d58d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['43f74207-cce2-45c0-b433-8de91a69071b'], standard_attr_id=275, status=DOWN, tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, updated_at=2025-11-26T09:57:57Z on network 4d6c05df-68f7-4c5b-baae-8e36a676fee9#033[00m Nov 26 04:57:57 localhost podman[240049]: time="2025-11-26T09:57:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:57:57 localhost podman[240049]: @ - - [26/Nov/2025:09:57:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 26 04:57:57 localhost podman[240049]: @ - - [26/Nov/2025:09:57:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19207 "" "Go-http-client/1.1" Nov 26 04:57:57 localhost podman[306514]: 2025-11-26 09:57:57.618837631 +0000 UTC m=+0.048000497 container kill b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:57:57 localhost dnsmasq[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/addn_hosts - 2 addresses Nov 26 04:57:57 localhost 
dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/host Nov 26 04:57:57 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/opts Nov 26 04:57:57 localhost nova_compute[281415]: 2025-11-26 09:57:57.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:57 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:57.868 262471 INFO neutron.agent.dhcp.agent [None req-94a04d4d-f6b8-4074-a9de-cbe5483bcb47 - - - - - -] DHCP configuration for ports {'0d550667-720d-4ac9-8474-c7582e0d87e3'} is completed#033[00m Nov 26 04:57:58 localhost nova_compute[281415]: 2025-11-26 09:57:58.844 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:58 localhost nova_compute[281415]: 2025-11-26 09:57:58.972 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:58 localhost nova_compute[281415]: 2025-11-26 09:57:58.973 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:58 localhost nova_compute[281415]: 2025-11-26 09:57:58.973 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, 
skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:57:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:57:59 localhost nova_compute[281415]: 2025-11-26 09:57:59.610 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:57:59.809 262471 INFO neutron.agent.linux.ip_lib [None req-a941824d-c01a-482a-a04c-235c158d1bcb - - - - - -] Device tap06cca76d-18 cannot be used as it has no MAC address#033[00m Nov 26 04:57:59 localhost nova_compute[281415]: 2025-11-26 09:57:59.832 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:59 localhost kernel: device tap06cca76d-18 entered promiscuous mode Nov 26 04:57:59 localhost nova_compute[281415]: 2025-11-26 09:57:59.840 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:59 localhost ovn_controller[153664]: 2025-11-26T09:57:59Z|00072|binding|INFO|Claiming lport 06cca76d-18d0-4a6c-ad7d-2c93205f45ce for this chassis. 
Nov 26 04:57:59 localhost ovn_controller[153664]: 2025-11-26T09:57:59Z|00073|binding|INFO|06cca76d-18d0-4a6c-ad7d-2c93205f45ce: Claiming unknown Nov 26 04:57:59 localhost NetworkManager[5970]: [1764151079.8432] manager: (tap06cca76d-18): new Generic device (/org/freedesktop/NetworkManager/Devices/18) Nov 26 04:57:59 localhost nova_compute[281415]: 2025-11-26 09:57:59.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:59 localhost nova_compute[281415]: 2025-11-26 09:57:59.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:57:59 localhost systemd-udevd[306547]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 04:57:59 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:59.858 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4f38550a-23be-44bf-bced-1e632a24bf8c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f38550a-23be-44bf-bced-1e632a24bf8c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4dafc326e594f1996993253bb2d58d6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a19e2cd7-918e-429d-b3f8-5eb7832ddf0e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=06cca76d-18d0-4a6c-ad7d-2c93205f45ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:57:59 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:59.862 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 06cca76d-18d0-4a6c-ad7d-2c93205f45ce in datapath 4f38550a-23be-44bf-bced-1e632a24bf8c bound to our chassis#033[00m Nov 26 04:57:59 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:59.865 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port cee63832-c8ec-4b65-b763-723a40379b0f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:57:59 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:59.866 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f38550a-23be-44bf-bced-1e632a24bf8c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:57:59 localhost ovn_metadata_agent[159481]: 2025-11-26 09:57:59.867 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[c193d305-151e-416d-90e3-71f953220963]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:57:59 localhost ovn_controller[153664]: 2025-11-26T09:57:59Z|00074|binding|INFO|Setting lport 06cca76d-18d0-4a6c-ad7d-2c93205f45ce ovn-installed in OVS Nov 26 04:57:59 localhost ovn_controller[153664]: 2025-11-26T09:57:59Z|00075|binding|INFO|Setting lport 06cca76d-18d0-4a6c-ad7d-2c93205f45ce up in Southbound Nov 26 04:57:59 localhost nova_compute[281415]: 2025-11-26 09:57:59.879 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:59 localhost journal[229445]: ethtool ioctl error on tap06cca76d-18: No such device Nov 26 04:57:59 localhost journal[229445]: ethtool ioctl error on tap06cca76d-18: No such device Nov 26 04:57:59 localhost journal[229445]: ethtool ioctl error on tap06cca76d-18: No such device Nov 26 04:57:59 localhost journal[229445]: ethtool ioctl error on tap06cca76d-18: No such device Nov 26 04:57:59 localhost journal[229445]: ethtool ioctl error on tap06cca76d-18: No such device Nov 26 04:57:59 localhost journal[229445]: ethtool ioctl error on tap06cca76d-18: No such device Nov 26 04:57:59 localhost journal[229445]: ethtool ioctl error on tap06cca76d-18: No such device Nov 26 04:57:59 localhost journal[229445]: ethtool ioctl error on tap06cca76d-18: No such device Nov 
26 04:57:59 localhost nova_compute[281415]: 2025-11-26 09:57:59.930 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:57:59 localhost nova_compute[281415]: 2025-11-26 09:57:59.963 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:00 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:00.070 2 INFO neutron.agent.securitygroups_rpc [None req-45ba97a6-3029-460d-a1de-9a944c33ecdb 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Security group member updated ['43f74207-cce2-45c0-b433-8de91a69071b']#033[00m Nov 26 04:58:00 localhost podman[306617]: Nov 26 04:58:00 localhost nova_compute[281415]: 2025-11-26 09:58:00.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:58:00 localhost nova_compute[281415]: 2025-11-26 09:58:00.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:58:00 localhost podman[306617]: 2025-11-26 09:58:00.858788879 +0000 UTC m=+0.100923913 container create b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4f38550a-23be-44bf-bced-1e632a24bf8c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 26 04:58:00 localhost nova_compute[281415]: 2025-11-26 09:58:00.880 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:00 localhost nova_compute[281415]: 2025-11-26 09:58:00.881 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:00 localhost nova_compute[281415]: 2025-11-26 09:58:00.881 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:00 localhost nova_compute[281415]: 2025-11-26 09:58:00.882 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:58:00 localhost nova_compute[281415]: 2025-11-26 09:58:00.882 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 
04:58:00 localhost podman[306617]: 2025-11-26 09:58:00.811128153 +0000 UTC m=+0.053263217 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:58:00 localhost systemd[1]: Started libpod-conmon-b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560.scope. Nov 26 04:58:00 localhost systemd[1]: Started libcrun container. Nov 26 04:58:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2873b0d5166b84ff894f50b3b0b76a303a776490f3765b504a03f6a6020334e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:58:00 localhost podman[306617]: 2025-11-26 09:58:00.960224106 +0000 UTC m=+0.202359140 container init b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 04:58:00 localhost podman[306617]: 2025-11-26 09:58:00.972838439 +0000 UTC m=+0.214973483 container start b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:58:00 localhost dnsmasq[306636]: started, version 
2.85 cachesize 150 Nov 26 04:58:00 localhost dnsmasq[306636]: DNS service limited to local subnets Nov 26 04:58:00 localhost dnsmasq[306636]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 04:58:00 localhost dnsmasq[306636]: warning: no upstream servers configured Nov 26 04:58:00 localhost dnsmasq-dhcp[306636]: DHCP, static leases only on 19.80.0.0, lease time 1d Nov 26 04:58:00 localhost dnsmasq[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/addn_hosts - 0 addresses Nov 26 04:58:00 localhost dnsmasq-dhcp[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/host Nov 26 04:58:00 localhost dnsmasq-dhcp[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/opts Nov 26 04:58:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:01.049 262471 INFO neutron.agent.dhcp.agent [None req-fc217772-8b19-4790-8861-d5053c85fd79 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:57:59Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a2423fe8-d5c4-4093-930d-d6a8e0773b38, ip_allocation=immediate, mac_address=fa:16:3e:9e:a0:b0, name=tempest-subport-299618990, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:57:57Z, description=, dns_domain=, id=4f38550a-23be-44bf-bced-1e632a24bf8c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-1506306690, port_security_enabled=True, project_id=b4dafc326e594f1996993253bb2d58d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=1692, qos_policy_id=None, 
revision_number=2, router:external=False, shared=False, standard_attr_id=281, status=ACTIVE, subnets=['64f549ec-56e8-4bfa-b47f-5c171087bcaf'], tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, updated_at=2025-11-26T09:57:58Z, vlan_transparent=None, network_id=4f38550a-23be-44bf-bced-1e632a24bf8c, port_security_enabled=True, project_id=b4dafc326e594f1996993253bb2d58d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['43f74207-cce2-45c0-b433-8de91a69071b'], standard_attr_id=295, status=DOWN, tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, updated_at=2025-11-26T09:57:59Z on network 4f38550a-23be-44bf-bced-1e632a24bf8c#033[00m Nov 26 04:58:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:01.139 262471 INFO neutron.agent.dhcp.agent [None req-8805cb33-f898-4a05-970d-3af8d7d8d8ae - - - - - -] DHCP configuration for ports {'902cac07-b2a2-4229-98ef-295348da5d42'} is completed#033[00m Nov 26 04:58:01 localhost dnsmasq[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/addn_hosts - 1 addresses Nov 26 04:58:01 localhost dnsmasq-dhcp[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/host Nov 26 04:58:01 localhost dnsmasq-dhcp[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/opts Nov 26 04:58:01 localhost podman[306672]: 2025-11-26 09:58:01.278565625 +0000 UTC m=+0.067717646 container kill b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 
04:58:01 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:58:01 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1500212677' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.351 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.418 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.418 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.472 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.711 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.712 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11429MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.713 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.713 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.781 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.781 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.782 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:58:01 localhost nova_compute[281415]: 2025-11-26 09:58:01.836 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:58:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:01.972 262471 INFO neutron.agent.dhcp.agent [None req-857f6d43-65f2-4f06-9742-5464c98576cb - - - - - -] DHCP configuration for ports {'a2423fe8-d5c4-4093-930d-d6a8e0773b38'} is completed#033[00m Nov 26 04:58:02 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e90 e90: 6 total, 6 up, 6 in Nov 26 04:58:02 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:58:02 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3880762759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:58:02 localhost nova_compute[281415]: 2025-11-26 09:58:02.344 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:58:02 localhost nova_compute[281415]: 2025-11-26 09:58:02.354 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:58:02 localhost nova_compute[281415]: 2025-11-26 09:58:02.384 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:58:02 localhost nova_compute[281415]: 2025-11-26 09:58:02.386 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 04:58:02 localhost nova_compute[281415]: 2025-11-26 09:58:02.387 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:58:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 04:58:02 localhost podman[306719]: 2025-11-26 09:58:02.837460491 +0000 UTC m=+0.089633861 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public) Nov 26 04:58:02 localhost podman[306719]: 2025-11-26 09:58:02.877767513 +0000 UTC m=+0.129940843 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41) Nov 26 04:58:02 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:58:02 localhost podman[306718]: 2025-11-26 09:58:02.890274602 +0000 UTC m=+0.144537115 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:58:02 localhost nova_compute[281415]: 2025-11-26 09:58:02.957 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:03 localhost podman[306718]: 2025-11-26 09:58:03.000352443 +0000 UTC m=+0.254614886 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller) Nov 26 04:58:03 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.586 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.590 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bc667141-bc1d-4cd6-b1fe-7d1dbb66e455', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.587370', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '662e3e8a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': 'fcac808a26719c6365c370e95ecd72a3b772b035cce35a5460dc4cac2375c92d'}]}, 'timestamp': '2025-11-26 09:58:03.591563', '_unique_id': '8e51a3cba52b47fa9f4caa5fe39b7484'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.593 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.594 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.594 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94cb533e-96ad-44c8-9836-5aea562c51de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.594778', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None,
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '662ed304-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': '380dcefe7ace658385610b003068e85e722cf572dd485255263610b5f5cecc81'}]}, 'timestamp': '2025-11-26 09:58:03.595294', '_unique_id': '81b3aaea6d7643bba5bec3600810880a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR
oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]:
2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26
04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.596 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.597 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.597 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '9e25f3e4-dbf1-423a-813d-85b14358bd06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.597427', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '662f395c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': '9088b0b2f8928d43e8b53a2b8e1ea4b8264412dafdff2790c867f61d171e8e50'}]}, 'timestamp': '2025-11-26 09:58:03.597887', '_unique_id': '27802afd2aa04ca9ab8654b596a514dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.598 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.600 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41cd0d74-c355-4242-9127-99e99fd21719', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.600035', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8',
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '662f9f28-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': 'b04039970ecdbf44c1b5e5e1ab0b0a5128d47c2e21d2d5ad4b8f9cdb37049731'}]}, 'timestamp': '2025-11-26 09:58:03.600494', '_unique_id': 'f277ca5186a34670b1886a09a2a05c30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.601 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.601 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.602 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.614 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.615 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9244f6ff-95d5-4f8a-82ae-4a2e0343a9a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.602578', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6631e04e-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.844826945, 'message_signature': 'ad4607f141cf8c0f639af4d253fc8a76c2ef1e2e2526dd6a2f45dab47ecd8489'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.602578', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6631f228-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.844826945, 'message_signature': 'fc08953f6ada073ecba4d9328dc64a499b04fefe9590d61fd03a7b82353d9b12'}]}, 'timestamp': '2025-11-26 09:58:03.615696', '_unique_id': 'de6be932ad304df3889cfb0c60a2c902'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.616 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.616 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.617 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.652 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.653 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '355a99f6-bed6-484b-894c-b5d72bfe055f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.617897', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6637bbe0-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': '4f1c2ab3d848be484e801985569fe293259c0ae0b24ab51d6a4a41fa235cc8c1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.617897', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6637d166-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': '8091c78a9a7e0acff95c4bd526b264018e51aa9c5fbee079afc6d421983f02f0'}]}, 'timestamp': '2025-11-26 09:58:03.654205', '_unique_id': '186169b059b34d5e9a0e25563cc0cbad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.655 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.657 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64ab84c8-b4ab-4af3-9450-f785f6cd623c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.657271', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '66385cbc-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': '7f7917d246c770df17809c23eefda0bffeceafa110da0cafebe00d5f38a8e804'}]}, 'timestamp': '2025-11-26 09:58:03.657779', '_unique_id': 'c042605449cf4c7ab995549b6c332c3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.658 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.659 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.659 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad1ff061-fcc6-414b-ab7b-538d8a97a31f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.660064', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6638c864-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.844826945, 'message_signature': '12495ae76a5962c563dd68cb4292c0c98bfdeb3bdf3c89df887b20ed603bf65e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.660064', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6638d840-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.844826945, 'message_signature': 'd86c2b6857b0a65287b3d560d4f64e4cacfe0458dfd321a1abcc1238d0622da1'}]}, 'timestamp': '2025-11-26 09:58:03.660903', '_unique_id': '9cd9dc5ba3314c5ea6c074908c439dd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.661 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.662 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:03.665 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 26 04:58:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:03.666 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 26 04:58:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:03.667 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '512b11e4-1b67-4ada-8194-8306636ca205', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.663059', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '66393dbc-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.844826945, 'message_signature': '863f181efa700b9587392fe6c76c70376b95e59bd8fe5449c06cf42fa477c57b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.663059', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af',
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '66394de8-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.844826945, 'message_signature': '5f1e05c268f2226d5228367ad5257fb0bc69e3eddd6d9caa7712743b8b654695'}]}, 'timestamp': '2025-11-26 09:58:03.663919', '_unique_id': '5f247c5f82ad485ca554da7db0c460ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.664 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.664 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.664 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.667 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.685 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 15380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5f38be3d-44aa-4663-8dd9-70bfe91bb15c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15380000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:58:03.668200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '663ca880-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.927480072, 'message_signature': '2f05381a594aefea634c71aa02b45efedf2a245e9bb886e30d4d8dfe10b9a724'}]}, 'timestamp': '2025-11-26 09:58:03.686075', '_unique_id': '82ac104724f94a24b38a6f0dbd81dea4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.687 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 04:58:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.689 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5f6f03e-02a4-425e-958d-60cd833457f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.689149', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '663d39da-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 
'message_signature': '09e1f7d526a58ecb550416658f622b4d9d22e4468f2404fd7ae7bc24a42cc4a0'}]}, 'timestamp': '2025-11-26 09:58:03.689657', '_unique_id': 'b96812b9d308485d85543e78cd80ca3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.690 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.691 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.692 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36a0385f-b5f7-4a81-8153-b4b39d09d8a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T09:58:03.691987', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '663da758-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.927480072, 'message_signature': 'c250c0698b9f0f2bb0c2d68f7d4761e66138b4f3008cec61d600357b9560e2c7'}]}, 'timestamp': '2025-11-26 09:58:03.692439', '_unique_id': 'a3bc78fb30de49d1be6bfcb41194e27e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.693 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.694 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.694 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7219668-2978-48c7-b108-838c04d165d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.694706', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '663e1288-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': '12ae490bbcfb6d1134805a38b15fb8cbe5749b7470887fb96feb3871c439420e'}]}, 'timestamp': '2025-11-26 09:58:03.695229', '_unique_id': 'e0c981181dab400a8c25a096bb78c8ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.696 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.697 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.697 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7686048-6056-4268-9d64-12d92e09aaf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.697359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '663e78c2-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': '8bfbf0cc31ac977bdf7e0cc0e54a258973cb9225e371a2506323dff87a84d71b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.697359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '663e8a10-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': 'a716b53069af106f98e1ad84b46270c9233a0c0ca92120ac497d54866f2d1dc4'}]}, 'timestamp': '2025-11-26 09:58:03.698231', '_unique_id': '0dd31497475e425d84420f37b0913615'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.700 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.700 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.700 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.701 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a63d9e43-671e-4c92-adbc-97f294f5d422', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.700544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '663ef554-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': '4dd9e506aa25d86c8204892e18ce1f560f6563a0d4db0b8505f243c828ef949c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 
'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.700544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '663f06d4-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': '163c483769e05c8a786deff2026af607a6186ac237e2affd1d9929c15889efdc'}]}, 'timestamp': '2025-11-26 09:58:03.701421', '_unique_id': 'd995d6e2b5da42f4a2e1ce13c1a91ce1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 
323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 
ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.702 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.704 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c6217e9-b640-4cfd-ae3f-a36e90b40a56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.703567', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '663f6b06-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': 
'c365a691a8da8074a3c06b8b8f779ea59afab3b95d1ca9226a5b9e6adc6a26bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.703567', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '663f7cb8-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': '92e983da37c60e6f07bc928bb457cf7dd1ab07b493bfda9be4e6790ca874c0ba'}]}, 'timestamp': '2025-11-26 09:58:03.704438', '_unique_id': 'f394a0deac7c4906ae8b954fef6a2611'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.705 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.706 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.706 12 
DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.707 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9d21585-4bba-4399-8a3e-8e2a5d1ee472', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.706548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 
'message_id': '663fdf5a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': '0f65260f84a01b7984769a0dc0b42152d4744d34bb43b74c128e4e425991763f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.706548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '663ff18e-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': 'e89747f8df762a5fa87eb3dc0e4940f304e50f35dbd8948073a04751a702ef69'}]}, 'timestamp': '2025-11-26 09:58:03.707432', '_unique_id': 'ccb76fe186eb42f98a2c6e5fe4fb19ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
09:58:03.709 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging return 
rpc_common.ConnectionContext(self._connection_pool, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.710 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.outgoing.packets in the context of pollsters Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.710 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a3a2299-9b64-4d68-b43c-294993c14ae5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.710618', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 
'66407f14-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': 'a6d28da908888dfd1edafbbf8fd2c26f2474c290edb231f605132ef30be34676'}]}, 'timestamp': '2025-11-26 09:58:03.711113', '_unique_id': '7f4667ca427443fbb088c82d2ca1bb40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging yield Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 04:58:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.712 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.713 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e33183b-2a29-4d0a-bbcb-0c6ba5f7d787', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T09:58:03.713530', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6640f02a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': '946ac493658748fedd0d9a8e7c43141e1b2ff89533e52911fd2ad619c6d360b5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T09:58:03.713530', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '66410178-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.86017671, 'message_signature': 'b4cc91ddb40426ec3a1e9f8de7a008346f696667c72388fcd7b86a9a21bab0d9'}]}, 'timestamp': '2025-11-26 09:58:03.714389', '_unique_id': 'b6e79434baf843658214f33e276e20ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.715 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.719 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7932f273-c08a-4474-b758-9686c7c65631', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.719765', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '6641e5de-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': 'b864a5c6bcae2e59dde33c0681ea1133ed96e74461cacb435dc4835c9bbf82b4'}]}, 'timestamp': '2025-11-26 09:58:03.720281', '_unique_id': '21b382a37aaa4a4ca3715015c50fe9e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.721 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.722 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.722 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68db42fa-45d9-47de-92a5-8aa355a691f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7111, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T09:58:03.722467', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '66424d76-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11558.829633244, 'message_signature': '955d920bb31cc1c0e7d9656b3ca5d246b144c12a73c6ed9ccbcfd3a83660959f'}]}, 'timestamp': '2025-11-26 09:58:03.722923', '_unique_id': '4c3f0458f7a34602a48d1247811016ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26
04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 04:58:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 09:58:03.724 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 04:58:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e91 e91: 6 total, 6 up, 6 in Nov 26 04:58:04 localhost nova_compute[281415]: 2025-11-26 09:58:04.649 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:05.035 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=6) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:05.039 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 04:58:05 localhost nova_compute[281415]: 2025-11-26 09:58:05.035 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:05 localhost nova_compute[281415]: 2025-11-26 09:58:05.388 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:58:05 localhost nova_compute[281415]: 2025-11-26 09:58:05.389 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 04:58:05 localhost nova_compute[281415]: 2025-11-26 09:58:05.389 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 04:58:05 localhost nova_compute[281415]: 2025-11-26 09:58:05.567 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:58:05 localhost nova_compute[281415]: 2025-11-26 09:58:05.568 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock 
"refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:58:05 localhost nova_compute[281415]: 2025-11-26 09:58:05.568 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 04:58:05 localhost nova_compute[281415]: 2025-11-26 09:58:05.569 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:58:06 localhost nova_compute[281415]: 2025-11-26 09:58:06.018 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:58:06 localhost nova_compute[281415]: 2025-11-26 09:58:06.031 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:58:06 localhost nova_compute[281415]: 2025-11-26 09:58:06.032 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 04:58:06 localhost nova_compute[281415]: 2025-11-26 09:58:06.498 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 04:58:06 localhost podman[306762]: 2025-11-26 09:58:06.830162646 +0000 UTC m=+0.091680073 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:58:06 localhost podman[306762]: 2025-11-26 09:58:06.866423556 +0000 UTC m=+0.127940943 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:58:06 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:58:07 localhost nova_compute[281415]: 2025-11-26 09:58:07.626 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:09 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:09.042 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:58:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:09 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:09.501 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005536117.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:57:57Z, description=, device_id=e1ef2930-c173-4abb-ba9c-3ef0c0b08400, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=tempest-livemigrationtest-server-2031069440, extra_dhcp_opts=[], fixed_ips=[], id=0d550667-720d-4ac9-8474-c7582e0d87e3, ip_allocation=immediate, mac_address=fa:16:3e:e7:38:fd, name=tempest-parent-1619212502, network_id=4d6c05df-68f7-4c5b-baae-8e36a676fee9, port_security_enabled=True, project_id=b4dafc326e594f1996993253bb2d58d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['43f74207-cce2-45c0-b433-8de91a69071b'], standard_attr_id=275, status=DOWN, tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, trunk_details=sub_ports=[], trunk_id=5141c774-a4f6-422c-9887-331cc02fe2c7, 
updated_at=2025-11-26T09:58:08Z on network 4d6c05df-68f7-4c5b-baae-8e36a676fee9#033[00m Nov 26 04:58:09 localhost nova_compute[281415]: 2025-11-26 09:58:09.693 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:09 localhost systemd[1]: tmp-crun.m0VbKK.mount: Deactivated successfully. Nov 26 04:58:09 localhost dnsmasq[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/addn_hosts - 2 addresses Nov 26 04:58:09 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/host Nov 26 04:58:09 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/opts Nov 26 04:58:09 localhost podman[306800]: 2025-11-26 09:58:09.789053096 +0000 UTC m=+0.077440631 container kill b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 04:58:10 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:10.046 262471 INFO neutron.agent.dhcp.agent [None req-060b1cb2-1e06-45b9-b1a0-107e6425ed42 - - - - - -] DHCP configuration for ports {'0d550667-720d-4ac9-8474-c7582e0d87e3'} is completed#033[00m Nov 26 04:58:11 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e92 e92: 6 total, 6 up, 6 in Nov 26 04:58:11 localhost nova_compute[281415]: 2025-11-26 09:58:11.541 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 
04:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:58:11 localhost systemd[1]: tmp-crun.PnnjLZ.mount: Deactivated successfully. Nov 26 04:58:11 localhost podman[306823]: 2025-11-26 09:58:11.842637051 +0000 UTC m=+0.098189290 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 04:58:11 localhost podman[306823]: 2025-11-26 09:58:11.885655006 +0000 UTC m=+0.141207235 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
container_name=multipathd, io.buildah.version=1.41.3) Nov 26 04:58:11 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:58:11 localhost podman[306822]: 2025-11-26 09:58:11.957107604 +0000 UTC m=+0.213531809 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 04:58:11 localhost podman[306822]: 2025-11-26 09:58:11.99057707 +0000 UTC m=+0.247001335 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:12 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:58:12 localhost systemd[1]: tmp-crun.OC5BcE.mount: Deactivated successfully. Nov 26 04:58:13 localhost nova_compute[281415]: 2025-11-26 09:58:13.072 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:13 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e93 e93: 6 total, 6 up, 6 in Nov 26 04:58:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:14 localhost nova_compute[281415]: 2025-11-26 09:58:14.728 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:15 localhost openstack_network_exporter[242153]: ERROR 09:58:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:58:15 localhost openstack_network_exporter[242153]: ERROR 09:58:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:58:15 localhost openstack_network_exporter[242153]: ERROR 09:58:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:58:15 localhost openstack_network_exporter[242153]: ERROR 09:58:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:58:15 localhost openstack_network_exporter[242153]: Nov 26 04:58:15 localhost openstack_network_exporter[242153]: ERROR 09:58:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:58:15 localhost openstack_network_exporter[242153]: Nov 26 04:58:16 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon).osd e94 e94: 6 total, 6 up, 6 in Nov 26 04:58:16 localhost sshd[306860]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:58:16 localhost nova_compute[281415]: 2025-11-26 09:58:16.544 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:16 localhost nova_compute[281415]: 2025-11-26 09:58:16.890 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:18 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:18.461 2 INFO neutron.agent.securitygroups_rpc [None req-53c998b2-c146-4ff7-901e-e13928ba1fda 9b97f8a50e9b4d2a829742e2c89653c3 7a98d39e7b5a4b068f04c2241b19fa64 - - default default] Security group member updated ['6cc24193-65ce-4dcb-aaea-4042f3aaa358']#033[00m Nov 26 04:58:18 localhost nova_compute[281415]: 2025-11-26 09:58:18.541 281419 DEBUG nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Creating tmpfile /var/lib/nova/instances/tmpsojbyn77 to notify to other compute nodes that they should mount the same storage. 
_create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m Nov 26 04:58:18 localhost nova_compute[281415]: 2025-11-26 09:58:18.576 281419 DEBUG nova.compute.manager [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] destination check data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpsojbyn77',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=,is_shared_block_storage=,is_shared_instance_path=,is_volume_backed=,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m Nov 26 04:58:18 localhost nova_compute[281415]: 2025-11-26 09:58:18.608 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:58:18 localhost nova_compute[281415]: 2025-11-26 09:58:18.608 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:58:18 localhost nova_compute[281415]: 2025-11-26 09:58:18.622 281419 INFO nova.compute.rpcapi [None req-94683686-a950-4e9e-99bf-20051d3800ba 
fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m Nov 26 04:58:18 localhost nova_compute[281415]: 2025-11-26 09:58:18.623 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:58:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:19 localhost nova_compute[281415]: 2025-11-26 09:58:19.730 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:20 localhost nova_compute[281415]: 2025-11-26 09:58:20.770 281419 DEBUG nova.compute.manager [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpsojbyn77',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e1ef2930-c173-4abb-ba9c-3ef0c0b08400',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m Nov 26 04:58:20 
localhost nova_compute[281415]: 2025-11-26 09:58:20.816 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Acquiring lock "refresh_cache-e1ef2930-c173-4abb-ba9c-3ef0c0b08400" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:58:20 localhost nova_compute[281415]: 2025-11-26 09:58:20.817 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Acquired lock "refresh_cache-e1ef2930-c173-4abb-ba9c-3ef0c0b08400" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:58:20 localhost nova_compute[281415]: 2025-11-26 09:58:20.817 281419 DEBUG nova.network.neutron [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 26 04:58:21 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:21.213 2 INFO neutron.agent.securitygroups_rpc [None req-4fe495cf-b79f-4232-9edc-b1e2ac36a6ca 9b97f8a50e9b4d2a829742e2c89653c3 7a98d39e7b5a4b068f04c2241b19fa64 - - default default] Security group member updated ['6cc24193-65ce-4dcb-aaea-4042f3aaa358']#033[00m Nov 26 04:58:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:21.341 262471 INFO neutron.agent.linux.ip_lib [None req-14f07cf1-3d05-4a5c-bff9-49923079aa0b - - - - - -] Device tapbf9d53a1-4d cannot be used as it has no MAC address#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.407 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:21 
localhost kernel: device tapbf9d53a1-4d entered promiscuous mode Nov 26 04:58:21 localhost NetworkManager[5970]: [1764151101.4174] manager: (tapbf9d53a1-4d): new Generic device (/org/freedesktop/NetworkManager/Devices/19) Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.421 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:21 localhost ovn_controller[153664]: 2025-11-26T09:58:21Z|00076|binding|INFO|Claiming lport bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5 for this chassis. Nov 26 04:58:21 localhost ovn_controller[153664]: 2025-11-26T09:58:21Z|00077|binding|INFO|bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5: Claiming unknown Nov 26 04:58:21 localhost systemd-udevd[306874]: Network interface NamePolicy= disabled on kernel command line. Nov 26 04:58:21 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:21.438 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d5a027cf-17c0-4785-85e8-7feed63239ef', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5a027cf-17c0-4785-85e8-7feed63239ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7a98d39e7b5a4b068f04c2241b19fa64', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=8dfd8184-bbd7-44be-9a4c-4bf8ed9fa939, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:21 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:21.441 159486 INFO neutron.agent.ovn.metadata.agent [-] Port bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5 in datapath d5a027cf-17c0-4785-85e8-7feed63239ef bound to our chassis#033[00m Nov 26 04:58:21 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:21.448 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 0309020f-22c0-4b54-a3ca-d2cb577f3eaa IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:58:21 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:21.448 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5a027cf-17c0-4785-85e8-7feed63239ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:58:21 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:21.449 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[8e702811-0af2-4ce8-a86f-d1d25e5221ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:21 localhost journal[229445]: ethtool ioctl error on tapbf9d53a1-4d: No such device Nov 26 04:58:21 localhost ovn_controller[153664]: 2025-11-26T09:58:21Z|00078|binding|INFO|Setting lport bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5 ovn-installed in OVS Nov 26 04:58:21 localhost ovn_controller[153664]: 2025-11-26T09:58:21Z|00079|binding|INFO|Setting lport bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5 up in Southbound Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.459 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:58:21 localhost journal[229445]: ethtool ioctl error on tapbf9d53a1-4d: No such device
Nov 26 04:58:21 localhost journal[229445]: ethtool ioctl error on tapbf9d53a1-4d: No such device
Nov 26 04:58:21 localhost journal[229445]: ethtool ioctl error on tapbf9d53a1-4d: No such device
Nov 26 04:58:21 localhost journal[229445]: ethtool ioctl error on tapbf9d53a1-4d: No such device
Nov 26 04:58:21 localhost journal[229445]: ethtool ioctl error on tapbf9d53a1-4d: No such device
Nov 26 04:58:21 localhost journal[229445]: ethtool ioctl error on tapbf9d53a1-4d: No such device
Nov 26 04:58:21 localhost journal[229445]: ethtool ioctl error on tapbf9d53a1-4d: No such device
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.502 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.535 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.547 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.756 281419 DEBUG nova.network.neutron [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Updating instance_info_cache with network_info: [{"id": "0d550667-720d-4ac9-8474-c7582e0d87e3", "address": "fa:16:3e:e7:38:fd", "network": {"id": "4d6c05df-68f7-4c5b-baae-8e36a676fee9", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2046186367-network",
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "b4dafc326e594f1996993253bb2d58d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d550667-72", "ovs_interfaceid": "0d550667-720d-4ac9-8474-c7582e0d87e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.776 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Releasing lock "refresh_cache-e1ef2930-c173-4abb-ba9c-3ef0c0b08400" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.779 281419 DEBUG nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] migrate_data in pre_live_migration: 
LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpsojbyn77',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e1ef2930-c173-4abb-ba9c-3ef0c0b08400',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.779 281419 DEBUG nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Creating instance directory: /var/lib/nova/instances/e1ef2930-c173-4abb-ba9c-3ef0c0b08400 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.780 281419 DEBUG nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Ensure instance console log exists: /var/lib/nova/instances/e1ef2930-c173-4abb-ba9c-3ef0c0b08400/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.781 281419 DEBUG nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - 
default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.782 281419 DEBUG nova.virt.libvirt.vif [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-26T09:58:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-2031069440',display_name='tempest-LiveMigrationTest-server-2031069440',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005536117.localdomain',hostname='tempest-livemigrationtest-server-2031069440',id=7,image_ref='211ae400-609a-4c22-9588-f4189139a50b',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-26T09:58:14Z,launched_on='np0005536117.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005536117.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b4dafc326e594f1996993253bb2d58d6',ramdisk_id='',reservation_id='r-px4b1dcl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='211ae400-609a-4c22-9588-f4189139a50b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',i
mage_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1996637473',owner_user_name='tempest-LiveMigrationTest-1996637473-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-11-26T09:58:14Z,user_data=None,user_id='08bd3011065645c0b2694bf134099aad',uuid=e1ef2930-c173-4abb-ba9c-3ef0c0b08400,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d550667-720d-4ac9-8474-c7582e0d87e3", "address": "fa:16:3e:e7:38:fd", "network": {"id": "4d6c05df-68f7-4c5b-baae-8e36a676fee9", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2046186367-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "b4dafc326e594f1996993253bb2d58d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0d550667-72", "ovs_interfaceid": "0d550667-720d-4ac9-8474-c7582e0d87e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.783 281419 DEBUG nova.network.os_vif_util [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default 
default] Converting VIF {"id": "0d550667-720d-4ac9-8474-c7582e0d87e3", "address": "fa:16:3e:e7:38:fd", "network": {"id": "4d6c05df-68f7-4c5b-baae-8e36a676fee9", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2046186367-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "b4dafc326e594f1996993253bb2d58d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0d550667-72", "ovs_interfaceid": "0d550667-720d-4ac9-8474-c7582e0d87e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.784 281419 DEBUG nova.network.os_vif_util [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=0d550667-720d-4ac9-8474-c7582e0d87e3,network=Network(4d6c05df-68f7-4c5b-baae-8e36a676fee9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0d550667-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.785 281419 DEBUG os_vif [None req-94683686-a950-4e9e-99bf-20051d3800ba 
fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=0d550667-720d-4ac9-8474-c7582e0d87e3,network=Network(4d6c05df-68f7-4c5b-baae-8e36a676fee9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0d550667-72') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.786 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.786 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.787 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.793 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.794 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d550667-72, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.794 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0d550667-72, col_values=(('external_ids', {'iface-id': '0d550667-720d-4ac9-8474-c7582e0d87e3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:38:fd', 'vm-uuid': 'e1ef2930-c173-4abb-ba9c-3ef0c0b08400'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.796 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.799 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.802 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.803 281419 INFO os_vif [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=0d550667-720d-4ac9-8474-c7582e0d87e3,network=Network(4d6c05df-68f7-4c5b-baae-8e36a676fee9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0d550667-72')#033[00m
Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.804 281419 DEBUG nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration.
pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m Nov 26 04:58:21 localhost nova_compute[281415]: 2025-11-26 09:58:21.804 281419 DEBUG nova.compute.manager [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpsojbyn77',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e1ef2930-c173-4abb-ba9c-3ef0c0b08400',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m Nov 26 04:58:22 localhost podman[306946]: Nov 26 04:58:22 localhost podman[306946]: 2025-11-26 09:58:22.489861012 +0000 UTC m=+0.097633773 container create a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:58:22 localhost systemd[1]: Started libpod-conmon-a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836.scope. 
Nov 26 04:58:22 localhost podman[306946]: 2025-11-26 09:58:22.442642689 +0000 UTC m=+0.050415480 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:58:22 localhost systemd[1]: Started libcrun container. Nov 26 04:58:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1adba597191654912891ec7f65deaf87e9bd7b4c252cd031144588c11dad901/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:58:22 localhost podman[306946]: 2025-11-26 09:58:22.586407891 +0000 UTC m=+0.194180662 container init a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:22 localhost podman[306946]: 2025-11-26 09:58:22.59624622 +0000 UTC m=+0.204018991 container start a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 04:58:22 localhost dnsmasq[306964]: started, version 2.85 cachesize 150 Nov 26 04:58:22 localhost dnsmasq[306964]: DNS service limited to local subnets Nov 26 04:58:22 localhost 
dnsmasq[306964]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 04:58:22 localhost dnsmasq[306964]: warning: no upstream servers configured Nov 26 04:58:22 localhost dnsmasq-dhcp[306964]: DHCP, static leases only on 19.80.0.0, lease time 1d Nov 26 04:58:22 localhost dnsmasq[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/addn_hosts - 0 addresses Nov 26 04:58:22 localhost dnsmasq-dhcp[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/host Nov 26 04:58:22 localhost dnsmasq-dhcp[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/opts Nov 26 04:58:22 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:22.665 262471 INFO neutron.agent.dhcp.agent [None req-085f467d-72ce-4724-ac44-6ee4447754d9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:58:20Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=00b2ba21-3e01-4336-8328-3ffa135dba55, ip_allocation=immediate, mac_address=fa:16:3e:24:00:ed, name=tempest-subport-555983667, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:58:18Z, description=, dns_domain=, id=d5a027cf-17c0-4785-85e8-7feed63239ef, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-307298584, port_security_enabled=True, project_id=7a98d39e7b5a4b068f04c2241b19fa64, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6065, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=443, status=ACTIVE, subnets=['07af6029-cdf0-4ddb-8c0f-12b57013ecd4'], 
tags=[], tenant_id=7a98d39e7b5a4b068f04c2241b19fa64, updated_at=2025-11-26T09:58:19Z, vlan_transparent=None, network_id=d5a027cf-17c0-4785-85e8-7feed63239ef, port_security_enabled=True, project_id=7a98d39e7b5a4b068f04c2241b19fa64, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6cc24193-65ce-4dcb-aaea-4042f3aaa358'], standard_attr_id=456, status=DOWN, tags=[], tenant_id=7a98d39e7b5a4b068f04c2241b19fa64, updated_at=2025-11-26T09:58:20Z on network d5a027cf-17c0-4785-85e8-7feed63239ef#033[00m Nov 26 04:58:22 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:22.785 262471 INFO neutron.agent.dhcp.agent [None req-235f7f3e-5c86-4e0f-9b2c-bd3b2c656fd3 - - - - - -] DHCP configuration for ports {'f203367e-aa22-4a21-9377-6a30389bb759'} is completed#033[00m Nov 26 04:58:22 localhost dnsmasq[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/addn_hosts - 1 addresses Nov 26 04:58:22 localhost dnsmasq-dhcp[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/host Nov 26 04:58:22 localhost dnsmasq-dhcp[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/opts Nov 26 04:58:22 localhost podman[306982]: 2025-11-26 09:58:22.940279628 +0000 UTC m=+0.065967913 container kill a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e95 e95: 6 total, 6 up, 6 in Nov 26 04:58:23 localhost 
neutron_dhcp_agent[262467]: 2025-11-26 09:58:23.621 262471 INFO neutron.agent.dhcp.agent [None req-f7f79fff-8f4b-458c-85ef-81471e321860 - - - - - -] DHCP configuration for ports {'00b2ba21-3e01-4336-8328-3ffa135dba55'} is completed#033[00m Nov 26 04:58:23 localhost nova_compute[281415]: 2025-11-26 09:58:23.965 281419 DEBUG nova.network.neutron [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Port 0d550667-720d-4ac9-8474-c7582e0d87e3 updated with migration profile {'migrating_to': 'np0005536118.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m Nov 26 04:58:23 localhost nova_compute[281415]: 2025-11-26 09:58:23.968 281419 DEBUG nova.compute.manager [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpsojbyn77',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='e1ef2930-c173-4abb-ba9c-3ef0c0b08400',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m Nov 26 04:58:24 localhost sshd[307003]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:58:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e95 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:24 localhost systemd-logind[761]: New session 73 of user nova. Nov 26 04:58:24 localhost systemd[1]: Created slice User Slice of UID 42436. Nov 26 04:58:24 localhost systemd[1]: Starting User Runtime Directory /run/user/42436... Nov 26 04:58:24 localhost systemd[1]: Finished User Runtime Directory /run/user/42436. Nov 26 04:58:24 localhost systemd[1]: Starting User Manager for UID 42436... Nov 26 04:58:24 localhost systemd[307007]: Queued start job for default target Main User Target. Nov 26 04:58:24 localhost systemd[307007]: Created slice User Application Slice. Nov 26 04:58:24 localhost systemd[307007]: Started Mark boot as successful after the user session has run 2 minutes. Nov 26 04:58:24 localhost systemd[307007]: Started Daily Cleanup of User's Temporary Directories. Nov 26 04:58:24 localhost systemd[307007]: Reached target Paths. Nov 26 04:58:24 localhost systemd[307007]: Reached target Timers. Nov 26 04:58:24 localhost systemd[307007]: Starting D-Bus User Message Bus Socket... Nov 26 04:58:24 localhost systemd[307007]: Starting Create User's Volatile Files and Directories... Nov 26 04:58:24 localhost systemd[307007]: Listening on D-Bus User Message Bus Socket. Nov 26 04:58:24 localhost systemd[307007]: Reached target Sockets. Nov 26 04:58:24 localhost systemd[307007]: Finished Create User's Volatile Files and Directories. Nov 26 04:58:24 localhost systemd[307007]: Reached target Basic System. Nov 26 04:58:24 localhost systemd[307007]: Reached target Main User Target. Nov 26 04:58:24 localhost systemd[307007]: Startup finished in 173ms. Nov 26 04:58:24 localhost systemd[1]: Started User Manager for UID 42436. Nov 26 04:58:24 localhost systemd[1]: Started Session 73 of User nova. Nov 26 04:58:24 localhost systemd[1]: Started libvirt secret daemon. 
Nov 26 04:58:24 localhost kernel: device tap0d550667-72 entered promiscuous mode Nov 26 04:58:24 localhost NetworkManager[5970]: [1764151104.8040] manager: (tap0d550667-72): new Tun device (/org/freedesktop/NetworkManager/Devices/20) Nov 26 04:58:24 localhost ovn_controller[153664]: 2025-11-26T09:58:24Z|00080|binding|INFO|Claiming lport 0d550667-720d-4ac9-8474-c7582e0d87e3 for this additional chassis. Nov 26 04:58:24 localhost ovn_controller[153664]: 2025-11-26T09:58:24Z|00081|binding|INFO|0d550667-720d-4ac9-8474-c7582e0d87e3: Claiming fa:16:3e:e7:38:fd 10.100.0.13 Nov 26 04:58:24 localhost ovn_controller[153664]: 2025-11-26T09:58:24Z|00082|binding|INFO|Claiming lport a2423fe8-d5c4-4093-930d-d6a8e0773b38 for this additional chassis. Nov 26 04:58:24 localhost ovn_controller[153664]: 2025-11-26T09:58:24Z|00083|binding|INFO|a2423fe8-d5c4-4093-930d-d6a8e0773b38: Claiming fa:16:3e:9e:a0:b0 19.80.0.10 Nov 26 04:58:24 localhost nova_compute[281415]: 2025-11-26 09:58:24.808 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:24 localhost systemd-udevd[307055]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 04:58:24 localhost nova_compute[281415]: 2025-11-26 09:58:24.812 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:24 localhost ovn_controller[153664]: 2025-11-26T09:58:24Z|00084|binding|INFO|Setting lport 0d550667-720d-4ac9-8474-c7582e0d87e3 ovn-installed in OVS Nov 26 04:58:24 localhost nova_compute[281415]: 2025-11-26 09:58:24.828 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:24 localhost NetworkManager[5970]: [1764151104.8313] device (tap0d550667-72): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 26 04:58:24 localhost nova_compute[281415]: 2025-11-26 09:58:24.830 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:24 localhost NetworkManager[5970]: [1764151104.8325] device (tap0d550667-72): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 26 04:58:24 localhost systemd-machined[83873]: New machine qemu-3-instance-00000007. Nov 26 04:58:24 localhost systemd[1]: Started Virtual Machine qemu-3-instance-00000007. 
Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.235 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.237 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] VM Started (Lifecycle Event)#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.260 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.377 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.595 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.839 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.841 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] VM Resumed (Lifecycle Event)#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.874 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Checking state 
_get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.883 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 26 04:58:25 localhost nova_compute[281415]: 2025-11-26 09:58:25.914 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] During the sync_power process the instance has moved from host np0005536117.localdomain to host np0005536118.localdomain#033[00m Nov 26 04:58:26 localhost systemd[1]: session-73.scope: Deactivated successfully. Nov 26 04:58:26 localhost systemd-logind[761]: Session 73 logged out. Waiting for processes to exit. Nov 26 04:58:26 localhost systemd-logind[761]: Removed session 73. 
Nov 26 04:58:26 localhost nova_compute[281415]: 2025-11-26 09:58:26.556 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:26 localhost nova_compute[281415]: 2025-11-26 09:58:26.797 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:27 localhost podman[240049]: time="2025-11-26T09:58:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:58:27 localhost podman[240049]: @ - - [26/Nov/2025:09:58:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159331 "" "Go-http-client/1.1" Nov 26 04:58:27 localhost podman[240049]: @ - - [26/Nov/2025:09:58:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20191 "" "Go-http-client/1.1" Nov 26 04:58:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:58:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:58:27 localhost podman[307113]: 2025-11-26 09:58:27.860119181 +0000 UTC m=+0.112700001 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:58:27 localhost podman[307113]: 2025-11-26 09:58:27.872681482 +0000 UTC m=+0.125262282 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 04:58:27 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:58:27 localhost systemd[1]: tmp-crun.Q1V8JJ.mount: Deactivated successfully. Nov 26 04:58:27 localhost podman[307114]: 2025-11-26 09:58:27.973410218 +0000 UTC m=+0.223681347 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 04:58:27 localhost podman[307114]: 2025-11-26 09:58:27.990896618 +0000 UTC m=+0.241167737 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 26 04:58:28 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:58:28 localhost ovn_controller[153664]: 2025-11-26T09:58:28Z|00085|binding|INFO|Claiming lport 0d550667-720d-4ac9-8474-c7582e0d87e3 for this chassis. Nov 26 04:58:28 localhost ovn_controller[153664]: 2025-11-26T09:58:28Z|00086|binding|INFO|0d550667-720d-4ac9-8474-c7582e0d87e3: Claiming fa:16:3e:e7:38:fd 10.100.0.13 Nov 26 04:58:28 localhost ovn_controller[153664]: 2025-11-26T09:58:28Z|00087|binding|INFO|Claiming lport a2423fe8-d5c4-4093-930d-d6a8e0773b38 for this chassis. Nov 26 04:58:28 localhost ovn_controller[153664]: 2025-11-26T09:58:28Z|00088|binding|INFO|a2423fe8-d5c4-4093-930d-d6a8e0773b38: Claiming fa:16:3e:9e:a0:b0 19.80.0.10 Nov 26 04:58:28 localhost ovn_controller[153664]: 2025-11-26T09:58:28Z|00089|binding|INFO|Setting lport 0d550667-720d-4ac9-8474-c7582e0d87e3 up in Southbound Nov 26 04:58:28 localhost ovn_controller[153664]: 2025-11-26T09:58:28Z|00090|binding|INFO|Setting lport a2423fe8-d5c4-4093-930d-d6a8e0773b38 up in Southbound Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.339 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:38:fd 10.100.0.13'], port_security=['fa:16:3e:e7:38:fd 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1619212502', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e1ef2930-c173-4abb-ba9c-3ef0c0b08400', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 
'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1619212502', 'neutron:project_id': 'b4dafc326e594f1996993253bb2d58d6', 'neutron:revision_number': '9', 'neutron:security_group_ids': '43f74207-cce2-45c0-b433-8de91a69071b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536117.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f9e5a10-6cc2-43d9-a208-4616c5b15844, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=0d550667-720d-4ac9-8474-c7582e0d87e3) old=Port_Binding(up=[False], additional_chassis=[], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.342 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:a0:b0 19.80.0.10'], port_security=['fa:16:3e:9e:a0:b0 19.80.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['0d550667-720d-4ac9-8474-c7582e0d87e3'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-299618990', 'neutron:cidrs': '19.80.0.10/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f38550a-23be-44bf-bced-1e632a24bf8c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-299618990', 'neutron:project_id': 'b4dafc326e594f1996993253bb2d58d6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '43f74207-cce2-45c0-b433-8de91a69071b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], 
additional_encap=[], encap=[], mirror_rules=[], datapath=a19e2cd7-918e-429d-b3f8-5eb7832ddf0e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a2423fe8-d5c4-4093-930d-d6a8e0773b38) old=Port_Binding(up=[False], additional_chassis=[], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.344 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0d550667-720d-4ac9-8474-c7582e0d87e3 in datapath 4d6c05df-68f7-4c5b-baae-8e36a676fee9 bound to our chassis#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.347 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 693a1fc3-b336-4d61-952b-36b18b17363b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.348 159486 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4d6c05df-68f7-4c5b-baae-8e36a676fee9#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.364 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[20ddf737-5bb5-474c-8b71-4fbd2d57b271]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.365 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4d6c05df-61 in ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.367 159592 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4d6c05df-60 not found in namespace None get_link_id 
/usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.367 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[6c64bbc8-aae5-4bf3-8838-aeb780fda15d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.368 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4f31ae-3a19-454a-95af-7d4310955c62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.388 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[e0155e92-c61b-423d-a73b-df6dd9385c00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.410 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[b4cf993a-b1e2-4e1f-b0d3-d3322bffea38]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.453 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[d30f6999-7014-480c-969c-42edb255ce51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.462 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a4560329-032e-4e9d-a188-98f3f2c8f1a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost NetworkManager[5970]: [1764151108.4664] manager: (tap4d6c05df-60): new Veth device (/org/freedesktop/NetworkManager/Devices/21) Nov 26 04:58:28 localhost systemd-udevd[307161]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.510 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[1a2057a2-8c16-4895-9199-8c8b983f94fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.515 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[d7fccbed-6e20-4199-b898-ffc0159e1f8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap4d6c05df-61: link becomes ready Nov 26 04:58:28 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap4d6c05df-60: link becomes ready Nov 26 04:58:28 localhost NetworkManager[5970]: [1764151108.5464] device (tap4d6c05df-60): carrier: link connected Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.554 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca37fb3-f25e-4245-97d9-c8ce5bbb8e77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.578 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f889ab-0097-4098-bcc7-6e91988de2b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d6c05df-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 
0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:a4:a0:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1158371, 
'reachable_time': 33591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307182, 'error': None, 'target': 'ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:28 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:28.594 2 WARNING 
neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-94683686-a950-4e9e-99bf-20051d3800ba req-81ba4846-eaf3-439f-b910-c4edd0cada49 73b2a26e2a774121b135f0c122458ae8 9aade6d9c4d94af5a0404e802fc179ab - - default default] This port is not SRIOV, skip binding for port 0d550667-720d-4ac9-8474-c7582e0d87e3.
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.598 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[eee093ab-65f8-4d15-b5e1-8153f2e03195]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:a011'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1158371, 'tstamp': 1158371}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307183, 'error': None, 'target': 'ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.618 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[e0136c34-db72-4fbe-8a50-c9bcd79334fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d6c05df-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:a4:a0:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1158371, 'reachable_time': 33591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 307185, 'error': None, 'target': 'ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.662 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc0e2cd-5654-44c6-80ee-8e2d92600d91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:28 localhost nova_compute[281415]: 2025-11-26 09:58:28.731 281419 INFO nova.compute.manager [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Post operation of migration started
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.745 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f2d7e7-a1c5-40db-93ca-f6393fad09ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.747 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d6c05df-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.748 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.749 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d6c05df-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 26 04:58:28 localhost nova_compute[281415]: 2025-11-26 09:58:28.794 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:58:28 localhost kernel: device tap4d6c05df-60 entered promiscuous mode
Nov 26 04:58:28 localhost nova_compute[281415]: 2025-11-26 09:58:28.799 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.800 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4d6c05df-60, col_values=(('external_ids', {'iface-id': '4d5f2dd9-f7f6-4d61-8ddb-4559bfe1ff74'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 26 04:58:28 localhost nova_compute[281415]: 2025-11-26 09:58:28.802 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:58:28 localhost ovn_controller[153664]: 2025-11-26T09:58:28Z|00091|binding|INFO|Releasing lport 4d5f2dd9-f7f6-4d61-8ddb-4559bfe1ff74 from this chassis (sb_readonly=0)
Nov 26 04:58:28 localhost nova_compute[281415]: 2025-11-26 09:58:28.811 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.813 159486 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4d6c05df-68f7-4c5b-baae-8e36a676fee9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4d6c05df-68f7-4c5b-baae-8e36a676fee9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 26 04:58:28 localhost nova_compute[281415]: 2025-11-26 09:58:28.813 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.814 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[32a52f83-521a-4988-94e6-9b48a6c166ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.815 159486 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg =
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: global
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     log         /dev/log local0 debug
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     log-tag     haproxy-metadata-proxy-4d6c05df-68f7-4c5b-baae-8e36a676fee9
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     user        root
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     group       root
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     maxconn     1024
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     pidfile     /var/lib/neutron/external/pids/4d6c05df-68f7-4c5b-baae-8e36a676fee9.pid.haproxy
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     daemon
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: defaults
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     log global
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     mode http
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     option httplog
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     option dontlognull
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     option http-server-close
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     option forwardfor
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     retries 3
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     timeout http-request 30s
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     timeout connect 30s
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     timeout client 32s
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     timeout server 32s
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     timeout http-keep-alive 30s
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: listen listener
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     bind 169.254.169.254:80
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     server metadata /var/lib/neutron/metadata_proxy
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:     http-request add-header X-OVN-Network-ID 4d6c05df-68f7-4c5b-baae-8e36a676fee9
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 26 04:58:28 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:28.817 159486 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'env', 'PROCESS_TAG=haproxy-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4d6c05df-68f7-4c5b-baae-8e36a676fee9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 26 04:58:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.274917) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151109274971, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 2500, "num_deletes": 252, "total_data_size": 4556476, "memory_usage": 4681400, "flush_reason": "Manual Compaction"}
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151109289390, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 2917096, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17368, "largest_seqno": 19863, "table_properties": {"data_size": 2907853, "index_size": 5748, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20371, "raw_average_key_size": 21, "raw_value_size": 2888783, "raw_average_value_size": 3015, "num_data_blocks": 247, "num_entries": 958, "num_filter_entries": 958, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150944, "oldest_key_time": 1764150944, "file_creation_time": 1764151109, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 14509 microseconds, and 4936 cpu microseconds.
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.289437) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 2917096 bytes OK
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.289464) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.292295) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.292313) EVENT_LOG_v1 {"time_micros": 1764151109292308, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.292338) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 4545336, prev total WAL file size 4545336, number of live WAL files 2.
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.293443) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(2848KB)], [27(15MB)]
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151109293536, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 19496182, "oldest_snapshot_seqno": -1}
Nov 26 04:58:29 localhost podman[307219]:
Nov 26 04:58:29 localhost podman[307219]: 2025-11-26 09:58:29.368334349 +0000 UTC m=+0.122349273 container create 5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 12012 keys, 17482970 bytes, temperature: kUnknown
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151109385643, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 17482970, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17412297, "index_size": 39532, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30085, "raw_key_size": 320410, "raw_average_key_size": 26, "raw_value_size": 17205660, "raw_average_value_size": 1432, "num_data_blocks": 1514, "num_entries": 12012, "num_filter_entries": 12012, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151109, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.385944) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 17482970 bytes
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.388267) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 211.5 rd, 189.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.8, 15.8 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(12.7) write-amplify(6.0) OK, records in: 12545, records dropped: 533 output_compression: NoCompression
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.388293) EVENT_LOG_v1 {"time_micros": 1764151109388279, "job": 14, "event": "compaction_finished", "compaction_time_micros": 92186, "compaction_time_cpu_micros": 52704, "output_level": 6, "num_output_files": 1, "total_output_size": 17482970, "num_input_records": 12545, "num_output_records": 12012, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151109388798, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151109390843, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.293331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.391009) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.391017) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.391020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.391021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:58:29 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:29.391023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 26 04:58:29 localhost podman[307219]: 2025-11-26 09:58:29.299735409 +0000 UTC m=+0.053750333 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 26 04:58:29 localhost systemd[1]: Started libpod-conmon-5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c.scope.
Nov 26 04:58:29 localhost systemd[1]: Started libcrun container.
Nov 26 04:58:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de82163ee46623114019a2ce11ea06e49ec01a76a478959ee59a75d7e062b472/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 04:58:29 localhost podman[307219]: 2025-11-26 09:58:29.46922688 +0000 UTC m=+0.223241804 container init 5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 26 04:58:29 localhost podman[307219]: 2025-11-26 09:58:29.479685998 +0000 UTC m=+0.233700922 container start 5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 26 04:58:29 localhost nova_compute[281415]: 2025-11-26 09:58:29.503 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Acquiring lock "refresh_cache-e1ef2930-c173-4abb-ba9c-3ef0c0b08400" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 26 04:58:29 localhost nova_compute[281415]: 2025-11-26 09:58:29.504 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Acquired lock "refresh_cache-e1ef2930-c173-4abb-ba9c-3ef0c0b08400" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 26 04:58:29 localhost nova_compute[281415]: 2025-11-26 09:58:29.505 281419 DEBUG nova.network.neutron [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 26 04:58:29 localhost neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9[307233]: [NOTICE] (307237) : New worker (307239) forked
Nov 26 04:58:29 localhost neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9[307233]: [NOTICE] (307237) : Loading success.
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.554 159486 INFO neutron.agent.ovn.metadata.agent [-] Port a2423fe8-d5c4-4093-930d-d6a8e0773b38 in datapath 4f38550a-23be-44bf-bced-1e632a24bf8c bound to our chassis
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.557 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port cee63832-c8ec-4b65-b763-723a40379b0f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.558 159486 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f38550a-23be-44bf-bced-1e632a24bf8c
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.573 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec21d7c-dccb-400f-bdfb-8289d7bad9eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.574 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4f38550a-21 in ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.576 159592 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4f38550a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.576 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1929de84-4429-4919-ad62-4d416fba3eb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.578 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[8298c61a-c3ce-404f-8197-4f7657217449]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.590 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[30fcf64e-00b6-435f-9329-7b0f6e4a2ed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.606 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[5cc63950-bc87-407e-b5e8-9f632f448a60]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.637 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[181f8ba8-6d43-4cc5-83af-55ba6d0572c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.645 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1f33927b-ee9f-45c2-a1b7-18b36439337a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost NetworkManager[5970]: [1764151109.6483] manager: (tap4f38550a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Nov 26 04:58:29 localhost systemd-udevd[307178]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.683 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[ab77469c-57a6-4461-b235-30d8a4960d02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.687 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca213e5-1a40-4ec4-bfab-1d19376bc6d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap4f38550a-21: link becomes ready
Nov 26 04:58:29 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap4f38550a-20: link becomes ready
Nov 26 04:58:29 localhost NetworkManager[5970]: [1764151109.7128] device (tap4f38550a-20): carrier: link connected
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.716 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[82f2e44c-9b9e-464b-9423-9502bb690a61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.737 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[5abf78ad-58b0-43c9-a791-a0722f8f71d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f38550a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:2b:e1:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1158488, 'reachable_time': 20143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307258, 'error': None, 'target': 'ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.754 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[689e0d02-6d13-4df2-b320-a4248247a9b6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2b:e138'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1158488, 'tstamp': 1158488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307259, 'error': None, 'target': 'ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.778 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bc7969-d1d8-484d-a3fa-f1e52cf3cc46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f38550a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:2b:e1:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1158488, 'reachable_time': 20143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0,
'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 307260, 'error': None, 'target': 'ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.817 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[677e1f1c-34be-44ff-a2ae-835ead216c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.886 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b9ec20-6126-4272-a470-6ce7985f0ae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:29 localhost 
ovn_metadata_agent[159481]: 2025-11-26 09:58:29.888 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f38550a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.888 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.889 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f38550a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:58:29 localhost nova_compute[281415]: 2025-11-26 09:58:29.935 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:29 localhost kernel: device tap4f38550a-20 entered promiscuous mode Nov 26 04:58:29 localhost nova_compute[281415]: 2025-11-26 09:58:29.939 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.941 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f38550a-20, col_values=(('external_ids', {'iface-id': '902cac07-b2a2-4229-98ef-295348da5d42'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:58:29 localhost nova_compute[281415]: 2025-11-26 09:58:29.943 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:29 localhost ovn_controller[153664]: 2025-11-26T09:58:29Z|00092|binding|INFO|Releasing lport 902cac07-b2a2-4229-98ef-295348da5d42 from this chassis (sb_readonly=0) Nov 26 04:58:29 localhost nova_compute[281415]: 2025-11-26 09:58:29.947 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.948 159486 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4f38550a-23be-44bf-bced-1e632a24bf8c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4f38550a-23be-44bf-bced-1e632a24bf8c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.949 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[e9073f2f-56da-4211-8093-9a5f7a06f502]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.950 159486 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: global Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: log /dev/log local0 debug Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: log-tag haproxy-metadata-proxy-4f38550a-23be-44bf-bced-1e632a24bf8c Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: user root Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: group root Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: maxconn 1024 Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: pidfile /var/lib/neutron/external/pids/4f38550a-23be-44bf-bced-1e632a24bf8c.pid.haproxy Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: daemon Nov 26 04:58:29 localhost 
ovn_metadata_agent[159481]: Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: defaults Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: log global Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: mode http Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: option httplog Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: option dontlognull Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: option http-server-close Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: option forwardfor Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: retries 3 Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: timeout http-request 30s Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: timeout connect 30s Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: timeout client 32s Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: timeout server 32s Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: timeout http-keep-alive 30s Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: listen listener Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: bind 169.254.169.254:80 Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: server metadata /var/lib/neutron/metadata_proxy Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: http-request add-header X-OVN-Network-ID 4f38550a-23be-44bf-bced-1e632a24bf8c Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 26 04:58:29 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:29.951 159486 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c', 'env', 'PROCESS_TAG=haproxy-4f38550a-23be-44bf-bced-1e632a24bf8c', 'haproxy', '-f', 
'/var/lib/neutron/ovn-metadata-proxy/4f38550a-23be-44bf-bced-1e632a24bf8c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 26 04:58:29 localhost nova_compute[281415]: 2025-11-26 09:58:29.956 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:30 localhost podman[307293]: Nov 26 04:58:30 localhost podman[307293]: 2025-11-26 09:58:30.430079602 +0000 UTC m=+0.105576385 container create e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:30 localhost systemd[1]: Started libpod-conmon-e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03.scope. Nov 26 04:58:30 localhost podman[307293]: 2025-11-26 09:58:30.380474727 +0000 UTC m=+0.055971550 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 26 04:58:30 localhost systemd[1]: tmp-crun.t0ve64.mount: Deactivated successfully. Nov 26 04:58:30 localhost systemd[1]: Started libcrun container. 
Nov 26 04:58:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a43a8b424417f3fa8a7eea8b9e05899066d5875a06891ceb2764a7144980a6ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:58:30 localhost podman[307293]: 2025-11-26 09:58:30.546894176 +0000 UTC m=+0.222390959 container init e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 04:58:30 localhost podman[307293]: 2025-11-26 09:58:30.557107125 +0000 UTC m=+0.232603908 container start e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:58:30 localhost neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c[307307]: [NOTICE] (307312) : New worker (307314) forked Nov 26 04:58:30 localhost neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c[307307]: [NOTICE] (307312) : Loading success. 
Nov 26 04:58:30 localhost nova_compute[281415]: 2025-11-26 09:58:30.957 281419 DEBUG nova.network.neutron [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Updating instance_info_cache with network_info: [{"id": "0d550667-720d-4ac9-8474-c7582e0d87e3", "address": "fa:16:3e:e7:38:fd", "network": {"id": "4d6c05df-68f7-4c5b-baae-8e36a676fee9", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2046186367-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "b4dafc326e594f1996993253bb2d58d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d550667-72", "ovs_interfaceid": "0d550667-720d-4ac9-8474-c7582e0d87e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:58:31 localhost nova_compute[281415]: 2025-11-26 09:58:31.003 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Releasing lock "refresh_cache-e1ef2930-c173-4abb-ba9c-3ef0c0b08400" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:58:31 localhost nova_compute[281415]: 2025-11-26 
09:58:31.033 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:31 localhost nova_compute[281415]: 2025-11-26 09:58:31.033 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:31 localhost nova_compute[281415]: 2025-11-26 09:58:31.034 281419 DEBUG oslo_concurrency.lockutils [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:31 localhost nova_compute[281415]: 2025-11-26 09:58:31.040 281419 INFO nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Sending announce-self command to QEMU monitor. 
Attempt 1 of 3#033[00m Nov 26 04:58:31 localhost journal[202976]: Domain id=3 name='instance-00000007' uuid=e1ef2930-c173-4abb-ba9c-3ef0c0b08400 is tainted: custom-monitor Nov 26 04:58:31 localhost nova_compute[281415]: 2025-11-26 09:58:31.560 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:31 localhost ovn_controller[153664]: 2025-11-26T09:58:31Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e7:38:fd 10.100.0.13 Nov 26 04:58:31 localhost ovn_controller[153664]: 2025-11-26T09:58:31Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e7:38:fd 10.100.0.13 Nov 26 04:58:31 localhost nova_compute[281415]: 2025-11-26 09:58:31.799 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:32 localhost nova_compute[281415]: 2025-11-26 09:58:32.050 281419 INFO nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m Nov 26 04:58:33 localhost nova_compute[281415]: 2025-11-26 09:58:33.059 281419 INFO nova.virt.libvirt.driver [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Sending announce-self command to QEMU monitor. 
Attempt 3 of 3#033[00m Nov 26 04:58:33 localhost nova_compute[281415]: 2025-11-26 09:58:33.067 281419 DEBUG nova.compute.manager [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:58:33 localhost nova_compute[281415]: 2025-11-26 09:58:33.086 281419 DEBUG nova.objects.instance [None req-94683686-a950-4e9e-99bf-20051d3800ba fedd3b1386b0406c95f728da494f6ed7 fcdf64e8dd0349709ec1016ea300f569 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m Nov 26 04:58:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:58:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:58:33 localhost podman[307323]: 2025-11-26 09:58:33.845278256 +0000 UTC m=+0.103093069 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 04:58:33 localhost podman[307324]: 2025-11-26 09:58:33.929945035 +0000 UTC m=+0.182747496 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, 
io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git) Nov 26 04:58:33 localhost podman[307323]: 2025-11-26 09:58:33.963500413 +0000 UTC m=+0.221315206 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 04:58:33 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:58:34 localhost podman[307324]: 2025-11-26 09:58:34.019936996 +0000 UTC m=+0.272739507 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base 
Image 9., vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm) Nov 26 04:58:34 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 04:58:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:34 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:34.139 262471 INFO neutron.agent.linux.ip_lib [None req-82526e4a-a185-4b27-be78-e521b09e1030 - - - - - -] Device tapf84f6614-29 cannot be used as it has no MAC address#033[00m Nov 26 04:58:34 localhost nova_compute[281415]: 2025-11-26 09:58:34.166 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:34 localhost kernel: device tapf84f6614-29 entered promiscuous mode Nov 26 04:58:34 localhost NetworkManager[5970]: [1764151114.1766] manager: (tapf84f6614-29): new Generic device (/org/freedesktop/NetworkManager/Devices/23) Nov 26 04:58:34 localhost nova_compute[281415]: 2025-11-26 09:58:34.180 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:34 localhost ovn_controller[153664]: 2025-11-26T09:58:34Z|00093|binding|INFO|Claiming lport 
f84f6614-2902-417c-b63b-598ceb7caab0 for this chassis. Nov 26 04:58:34 localhost ovn_controller[153664]: 2025-11-26T09:58:34Z|00094|binding|INFO|f84f6614-2902-417c-b63b-598ceb7caab0: Claiming unknown Nov 26 04:58:34 localhost systemd-udevd[307378]: Network interface NamePolicy= disabled on kernel command line. Nov 26 04:58:34 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:34.203 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-6a06c83c-a173-43fd-9343-735e8a52503a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a06c83c-a173-43fd-9343-735e8a52503a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f251213a9644261874d24d123ed8f23', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07926bbc-17e6-41e5-b392-ed892f429233, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f84f6614-2902-417c-b63b-598ceb7caab0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:34 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:34.205 159486 INFO neutron.agent.ovn.metadata.agent [-] Port f84f6614-2902-417c-b63b-598ceb7caab0 in datapath 6a06c83c-a173-43fd-9343-735e8a52503a bound to our chassis#033[00m Nov 26 
04:58:34 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:34.208 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a06c83c-a173-43fd-9343-735e8a52503a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 04:58:34 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:34.209 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[b128351c-0d0a-4db8-b49c-79ca6e855326]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:34 localhost journal[229445]: ethtool ioctl error on tapf84f6614-29: No such device Nov 26 04:58:34 localhost ovn_controller[153664]: 2025-11-26T09:58:34Z|00095|binding|INFO|Setting lport f84f6614-2902-417c-b63b-598ceb7caab0 ovn-installed in OVS Nov 26 04:58:34 localhost ovn_controller[153664]: 2025-11-26T09:58:34Z|00096|binding|INFO|Setting lport f84f6614-2902-417c-b63b-598ceb7caab0 up in Southbound Nov 26 04:58:34 localhost journal[229445]: ethtool ioctl error on tapf84f6614-29: No such device Nov 26 04:58:34 localhost nova_compute[281415]: 2025-11-26 09:58:34.226 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:34 localhost journal[229445]: ethtool ioctl error on tapf84f6614-29: No such device Nov 26 04:58:34 localhost journal[229445]: ethtool ioctl error on tapf84f6614-29: No such device Nov 26 04:58:34 localhost journal[229445]: ethtool ioctl error on tapf84f6614-29: No such device Nov 26 04:58:34 localhost journal[229445]: ethtool ioctl error on tapf84f6614-29: No such device Nov 26 04:58:34 localhost journal[229445]: ethtool ioctl error on tapf84f6614-29: No such device Nov 26 04:58:34 localhost journal[229445]: ethtool ioctl error on tapf84f6614-29: No such device Nov 26 04:58:34 localhost nova_compute[281415]: 
2025-11-26 09:58:34.272 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:34 localhost nova_compute[281415]: 2025-11-26 09:58:34.308 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:58:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:58:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:58:35 localhost podman[307449]: Nov 26 04:58:35 localhost podman[307449]: 2025-11-26 09:58:35.272552839 +0000 UTC m=+0.101104868 container create d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 26 04:58:35 localhost systemd[1]: Started libpod-conmon-d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6.scope. Nov 26 04:58:35 localhost podman[307449]: 2025-11-26 09:58:35.226960005 +0000 UTC m=+0.055512044 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:58:35 localhost systemd[1]: Started libcrun container. 
Nov 26 04:58:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/228f9830e991070f2a480624611f93d44afabdc42ed2aa40abbf57afcbd73a7a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:58:35 localhost nova_compute[281415]: 2025-11-26 09:58:35.352 281419 DEBUG nova.compute.manager [req-ebc9bd2a-3b33-4e07-bb4f-64b5452d0606 req-6e5c9a42-c457-4f0b-b828-a892d779ca86 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Received event network-vif-plugged-0d550667-720d-4ac9-8474-c7582e0d87e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:58:35 localhost nova_compute[281415]: 2025-11-26 09:58:35.354 281419 DEBUG oslo_concurrency.lockutils [req-ebc9bd2a-3b33-4e07-bb4f-64b5452d0606 req-6e5c9a42-c457-4f0b-b828-a892d779ca86 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:35 localhost nova_compute[281415]: 2025-11-26 09:58:35.354 281419 DEBUG oslo_concurrency.lockutils [req-ebc9bd2a-3b33-4e07-bb4f-64b5452d0606 req-6e5c9a42-c457-4f0b-b828-a892d779ca86 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:35 localhost nova_compute[281415]: 2025-11-26 09:58:35.354 281419 DEBUG oslo_concurrency.lockutils [req-ebc9bd2a-3b33-4e07-bb4f-64b5452d0606 req-6e5c9a42-c457-4f0b-b828-a892d779ca86 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - 
- default default] Lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:35 localhost nova_compute[281415]: 2025-11-26 09:58:35.355 281419 DEBUG nova.compute.manager [req-ebc9bd2a-3b33-4e07-bb4f-64b5452d0606 req-6e5c9a42-c457-4f0b-b828-a892d779ca86 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] No waiting events found dispatching network-vif-plugged-0d550667-720d-4ac9-8474-c7582e0d87e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 26 04:58:35 localhost nova_compute[281415]: 2025-11-26 09:58:35.355 281419 WARNING nova.compute.manager [req-ebc9bd2a-3b33-4e07-bb4f-64b5452d0606 req-6e5c9a42-c457-4f0b-b828-a892d779ca86 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Received unexpected event network-vif-plugged-0d550667-720d-4ac9-8474-c7582e0d87e3 for instance with vm_state active and task_state None.#033[00m Nov 26 04:58:35 localhost podman[307449]: 2025-11-26 09:58:35.35959742 +0000 UTC m=+0.188149449 container init d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:58:35 localhost podman[307449]: 2025-11-26 09:58:35.374535463 +0000 UTC m=+0.203087502 container 
start d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 04:58:35 localhost dnsmasq[307468]: started, version 2.85 cachesize 150 Nov 26 04:58:35 localhost dnsmasq[307468]: DNS service limited to local subnets Nov 26 04:58:35 localhost dnsmasq[307468]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 04:58:35 localhost dnsmasq[307468]: warning: no upstream servers configured Nov 26 04:58:35 localhost dnsmasq-dhcp[307468]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 04:58:35 localhost dnsmasq[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/addn_hosts - 0 addresses Nov 26 04:58:35 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/host Nov 26 04:58:35 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/opts Nov 26 04:58:35 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:35.562 262471 INFO neutron.agent.dhcp.agent [None req-31169c79-4f92-4272-afb8-31b18f30da52 - - - - - -] DHCP configuration for ports {'d160bab8-f6a4-4dc7-b4dd-c79a804f7583'} is completed#033[00m Nov 26 04:58:35 localhost nova_compute[281415]: 2025-11-26 09:58:35.644 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:35 localhost nova_compute[281415]: 
2025-11-26 09:58:35.851 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:36 localhost systemd[1]: Stopping User Manager for UID 42436... Nov 26 04:58:36 localhost systemd[307007]: Activating special unit Exit the Session... Nov 26 04:58:36 localhost systemd[307007]: Stopped target Main User Target. Nov 26 04:58:36 localhost systemd[307007]: Stopped target Basic System. Nov 26 04:58:36 localhost systemd[307007]: Stopped target Paths. Nov 26 04:58:36 localhost systemd[307007]: Stopped target Sockets. Nov 26 04:58:36 localhost systemd[307007]: Stopped target Timers. Nov 26 04:58:36 localhost systemd[307007]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 26 04:58:36 localhost systemd[307007]: Stopped Daily Cleanup of User's Temporary Directories. Nov 26 04:58:36 localhost systemd[307007]: Closed D-Bus User Message Bus Socket. Nov 26 04:58:36 localhost systemd[307007]: Stopped Create User's Volatile Files and Directories. Nov 26 04:58:36 localhost systemd[307007]: Removed slice User Application Slice. Nov 26 04:58:36 localhost systemd[307007]: Reached target Shutdown. Nov 26 04:58:36 localhost systemd[307007]: Finished Exit the Session. Nov 26 04:58:36 localhost systemd[307007]: Reached target Exit the Session. Nov 26 04:58:36 localhost systemd[1]: user@42436.service: Deactivated successfully. Nov 26 04:58:36 localhost systemd[1]: Stopped User Manager for UID 42436. Nov 26 04:58:36 localhost systemd[1]: Stopping User Runtime Directory /run/user/42436... Nov 26 04:58:36 localhost systemd[1]: user-runtime-dir@42436.service: Deactivated successfully. Nov 26 04:58:36 localhost systemd[1]: Stopped User Runtime Directory /run/user/42436. Nov 26 04:58:36 localhost systemd[1]: Removed slice User Slice of UID 42436. Nov 26 04:58:36 localhost systemd[1]: tmp-crun.MNi5vp.mount: Deactivated successfully. 
Nov 26 04:58:36 localhost systemd[1]: run-user-42436.mount: Deactivated successfully. Nov 26 04:58:36 localhost nova_compute[281415]: 2025-11-26 09:58:36.587 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:36 localhost nova_compute[281415]: 2025-11-26 09:58:36.801 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:37 localhost nova_compute[281415]: 2025-11-26 09:58:37.401 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:37 localhost nova_compute[281415]: 2025-11-26 09:58:37.417 281419 DEBUG nova.compute.manager [req-c8634a11-5c7d-4079-92c4-95c775b5f3a2 req-c56e3326-ef57-495e-80d9-a955c8e348d8 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Received event network-vif-plugged-0d550667-720d-4ac9-8474-c7582e0d87e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:58:37 localhost nova_compute[281415]: 2025-11-26 09:58:37.418 281419 DEBUG oslo_concurrency.lockutils [req-c8634a11-5c7d-4079-92c4-95c775b5f3a2 req-c56e3326-ef57-495e-80d9-a955c8e348d8 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:37 localhost nova_compute[281415]: 2025-11-26 09:58:37.418 281419 DEBUG oslo_concurrency.lockutils [req-c8634a11-5c7d-4079-92c4-95c775b5f3a2 req-c56e3326-ef57-495e-80d9-a955c8e348d8 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock 
"e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:37 localhost nova_compute[281415]: 2025-11-26 09:58:37.419 281419 DEBUG oslo_concurrency.lockutils [req-c8634a11-5c7d-4079-92c4-95c775b5f3a2 req-c56e3326-ef57-495e-80d9-a955c8e348d8 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:37 localhost nova_compute[281415]: 2025-11-26 09:58:37.419 281419 DEBUG nova.compute.manager [req-c8634a11-5c7d-4079-92c4-95c775b5f3a2 req-c56e3326-ef57-495e-80d9-a955c8e348d8 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] No waiting events found dispatching network-vif-plugged-0d550667-720d-4ac9-8474-c7582e0d87e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 26 04:58:37 localhost nova_compute[281415]: 2025-11-26 09:58:37.419 281419 WARNING nova.compute.manager [req-c8634a11-5c7d-4079-92c4-95c775b5f3a2 req-c56e3326-ef57-495e-80d9-a955c8e348d8 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Received unexpected event network-vif-plugged-0d550667-720d-4ac9-8474-c7582e0d87e3 for instance with vm_state active and task_state None.#033[00m Nov 26 04:58:37 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:37.619 2 INFO neutron.agent.securitygroups_rpc [req-46ae4a5a-66a8-4bd6-b924-8d5c360138fa req-6b75880f-cfd2-4892-a88f-38fc74c5fbaf 9c32896295294ed68040ac238fcae076 7d3979babeb24fa2a182404ff97181c7 - - default 
default] Security group rule updated ['9efdd07f-2765-46ab-a36d-e494c9582d7e']#033[00m Nov 26 04:58:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:58:37 localhost podman[307470]: 2025-11-26 09:58:37.838441387 +0000 UTC m=+0.091455486 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:58:37 localhost podman[307470]: 2025-11-26 09:58:37.848586574 +0000 UTC m=+0.101600663 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 04:58:37 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:58:38 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:38.940 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:58:38Z, description=, device_id=022964cf-d685-42b9-9482-9b35b6682ea5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5e44092e-b6b3-491f-b86d-a62fa0794c46, ip_allocation=immediate, mac_address=fa:16:3e:8f:1c:f4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:58:30Z, description=, dns_domain=, id=6a06c83c-a173-43fd-9343-735e8a52503a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-789735874-network, port_security_enabled=True, project_id=3f251213a9644261874d24d123ed8f23, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54121, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=518, status=ACTIVE, subnets=['0cabc8aa-78aa-41f6-b524-37bcf1e2f58a'], tags=[], tenant_id=3f251213a9644261874d24d123ed8f23, updated_at=2025-11-26T09:58:33Z, vlan_transparent=None, network_id=6a06c83c-a173-43fd-9343-735e8a52503a, port_security_enabled=False, project_id=3f251213a9644261874d24d123ed8f23, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=554, status=DOWN, tags=[], tenant_id=3f251213a9644261874d24d123ed8f23, updated_at=2025-11-26T09:58:38Z on network 6a06c83c-a173-43fd-9343-735e8a52503a#033[00m Nov 26 04:58:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:39 
localhost dnsmasq[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/addn_hosts - 1 addresses Nov 26 04:58:39 localhost podman[307508]: 2025-11-26 09:58:39.16785214 +0000 UTC m=+0.066305522 container kill d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 04:58:39 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/host Nov 26 04:58:39 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/opts Nov 26 04:58:39 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:39.334 2 INFO neutron.agent.securitygroups_rpc [req-044fac61-dc69-46ad-8006-1ef7286686fd req-0c3645e6-df26-4fdd-a2cc-e85418308479 9c32896295294ed68040ac238fcae076 7d3979babeb24fa2a182404ff97181c7 - - default default] Security group rule updated ['f85c0631-5d3d-4f39-9aaa-9e0c465f9b26']#033[00m Nov 26 04:58:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:39.535 262471 INFO neutron.agent.dhcp.agent [None req-08193473-ab95-43b5-b121-505941e693d5 - - - - - -] DHCP configuration for ports {'5e44092e-b6b3-491f-b86d-a62fa0794c46'} is completed#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.132 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Acquiring lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.132 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.133 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Acquiring lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.133 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.134 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 
0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.136 281419 INFO nova.compute.manager [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Terminating instance#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.137 281419 DEBUG nova.compute.manager [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m Nov 26 04:58:40 localhost kernel: device tap0d550667-72 left promiscuous mode Nov 26 04:58:40 localhost NetworkManager[5970]: [1764151120.2063] device (tap0d550667-72): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.219 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:40 localhost ovn_controller[153664]: 2025-11-26T09:58:40Z|00097|binding|INFO|Releasing lport 0d550667-720d-4ac9-8474-c7582e0d87e3 from this chassis (sb_readonly=0) Nov 26 04:58:40 localhost ovn_controller[153664]: 2025-11-26T09:58:40Z|00098|binding|INFO|Setting lport 0d550667-720d-4ac9-8474-c7582e0d87e3 down in Southbound Nov 26 04:58:40 localhost ovn_controller[153664]: 2025-11-26T09:58:40Z|00099|binding|INFO|Releasing lport a2423fe8-d5c4-4093-930d-d6a8e0773b38 from this chassis (sb_readonly=0) Nov 26 04:58:40 localhost ovn_controller[153664]: 2025-11-26T09:58:40Z|00100|binding|INFO|Setting lport a2423fe8-d5c4-4093-930d-d6a8e0773b38 down in Southbound Nov 26 04:58:40 
localhost ovn_controller[153664]: 2025-11-26T09:58:40Z|00101|binding|INFO|Removing iface tap0d550667-72 ovn-installed in OVS Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.225 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:40 localhost ovn_controller[153664]: 2025-11-26T09:58:40Z|00102|binding|INFO|Releasing lport 902cac07-b2a2-4229-98ef-295348da5d42 from this chassis (sb_readonly=0) Nov 26 04:58:40 localhost ovn_controller[153664]: 2025-11-26T09:58:40Z|00103|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:58:40 localhost ovn_controller[153664]: 2025-11-26T09:58:40Z|00104|binding|INFO|Releasing lport 4d5f2dd9-f7f6-4d61-8ddb-4559bfe1ff74 from this chassis (sb_readonly=0) Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.239 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:38:fd 10.100.0.13'], port_security=['fa:16:3e:e7:38:fd 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1619212502', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e1ef2930-c173-4abb-ba9c-3ef0c0b08400', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1619212502', 'neutron:project_id': 'b4dafc326e594f1996993253bb2d58d6', 'neutron:revision_number': '12', 'neutron:security_group_ids': '43f74207-cce2-45c0-b433-8de91a69071b', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f9e5a10-6cc2-43d9-a208-4616c5b15844, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=0d550667-720d-4ac9-8474-c7582e0d87e3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.242 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:a0:b0 19.80.0.10'], port_security=['fa:16:3e:9e:a0:b0 19.80.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['0d550667-720d-4ac9-8474-c7582e0d87e3'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-299618990', 'neutron:cidrs': '19.80.0.10/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f38550a-23be-44bf-bced-1e632a24bf8c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-299618990', 'neutron:project_id': 'b4dafc326e594f1996993253bb2d58d6', 'neutron:revision_number': '5', 'neutron:security_group_ids': '43f74207-cce2-45c0-b433-8de91a69071b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=a19e2cd7-918e-429d-b3f8-5eb7832ddf0e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a2423fe8-d5c4-4093-930d-d6a8e0773b38) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.244 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0d550667-720d-4ac9-8474-c7582e0d87e3 in datapath 4d6c05df-68f7-4c5b-baae-8e36a676fee9 unbound from our chassis#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.248 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 693a1fc3-b336-4d61-952b-36b18b17363b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.248 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d6c05df-68f7-4c5b-baae-8e36a676fee9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.249 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[df9dc287-9e44-4ade-9e16-39e25d0e588b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.250 159486 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9 namespace which is not needed anymore#033[00m Nov 26 04:58:40 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully. Nov 26 04:58:40 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 5.099s CPU time. Nov 26 04:58:40 localhost systemd-machined[83873]: Machine qemu-3-instance-00000007 terminated. 
Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.323 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.377 281419 INFO nova.virt.libvirt.driver [-] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Instance destroyed successfully.#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.378 281419 DEBUG nova.objects.instance [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Lazy-loading 'resources' on Instance uuid e1ef2930-c173-4abb-ba9c-3ef0c0b08400 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:58:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e96 e96: 6 total, 6 up, 6 in Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.432 281419 DEBUG nova.virt.libvirt.vif [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-26T09:58:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-2031069440',display_name='tempest-LiveMigrationTest-server-2031069440',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005536118.localdomain',hostname='tempest-livemigrationtest-server-2031069440',id=7,image_ref='211ae400-609a-4c22-9588-f4189139a50b',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-26T09:58:14Z,launched_on='np0005536117.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005536118.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b4dafc326e594f1996993253bb2d58d6',ramdisk_id='',reservation_id='r-px4b1dcl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='211ae400-609a-4c22-9588-f4189139a50b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1996637473',owner_user_name='tempest-LiveMigrationTest-1996637473-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2025-11-26T09:58:33Z,user_data=None,user_id='08bd3011065645c0
b2694bf134099aad',uuid=e1ef2930-c173-4abb-ba9c-3ef0c0b08400,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d550667-720d-4ac9-8474-c7582e0d87e3", "address": "fa:16:3e:e7:38:fd", "network": {"id": "4d6c05df-68f7-4c5b-baae-8e36a676fee9", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2046186367-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "b4dafc326e594f1996993253bb2d58d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d550667-72", "ovs_interfaceid": "0d550667-720d-4ac9-8474-c7582e0d87e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.433 281419 DEBUG nova.network.os_vif_util [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Converting VIF {"id": "0d550667-720d-4ac9-8474-c7582e0d87e3", "address": "fa:16:3e:e7:38:fd", "network": {"id": "4d6c05df-68f7-4c5b-baae-8e36a676fee9", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2046186367-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "b4dafc326e594f1996993253bb2d58d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d550667-72", "ovs_interfaceid": "0d550667-720d-4ac9-8474-c7582e0d87e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.434 281419 DEBUG nova.network.os_vif_util [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=0d550667-720d-4ac9-8474-c7582e0d87e3,network=Network(4d6c05df-68f7-4c5b-baae-8e36a676fee9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0d550667-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.435 281419 DEBUG os_vif [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=0d550667-720d-4ac9-8474-c7582e0d87e3,network=Network(4d6c05df-68f7-4c5b-baae-8e36a676fee9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0d550667-72') unplug 
/usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.437 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.437 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d550667-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.440 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.442 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.444 281419 INFO os_vif [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:38:fd,bridge_name='br-int',has_traffic_filtering=True,id=0d550667-720d-4ac9-8474-c7582e0d87e3,network=Network(4d6c05df-68f7-4c5b-baae-8e36a676fee9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap0d550667-72')#033[00m Nov 26 04:58:40 localhost neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9[307233]: [NOTICE] (307237) : haproxy version is 2.8.14-c23fe91 Nov 26 04:58:40 localhost neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9[307233]: [NOTICE] (307237) : path to executable is /usr/sbin/haproxy Nov 26 04:58:40 localhost neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9[307233]: 
[WARNING] (307237) : Exiting Master process... Nov 26 04:58:40 localhost neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9[307233]: [ALERT] (307237) : Current worker (307239) exited with code 143 (Terminated) Nov 26 04:58:40 localhost neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9[307233]: [WARNING] (307237) : All workers exited. Exiting... (0) Nov 26 04:58:40 localhost systemd[1]: libpod-5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c.scope: Deactivated successfully. Nov 26 04:58:40 localhost podman[307563]: 2025-11-26 09:58:40.498716367 +0000 UTC m=+0.094623542 container died 5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 04:58:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:58:40 localhost podman[307563]: 2025-11-26 09:58:40.602526107 +0000 UTC m=+0.198433272 container cleanup 5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:40 localhost podman[307593]: 2025-11-26 09:58:40.615911153 +0000 UTC m=+0.111039960 container cleanup 5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:58:40 localhost systemd[1]: libpod-conmon-5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c.scope: Deactivated successfully. 
Nov 26 04:58:40 localhost podman[307615]: 2025-11-26 09:58:40.732921093 +0000 UTC m=+0.104006166 container remove 5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.742 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[aa494bc9-1486-4c95-8257-7b160892b7a7]: (4, ('Wed Nov 26 09:58:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9 (5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c)\n5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c\nWed Nov 26 09:58:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9 (5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c)\n5af94a928bca259bc592706935dd68fd915085185d393aecb022dc74d1ecd77c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.745 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ecab8d9a-175c-4000-be7f-c4ca6367ee39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.746 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d6c05df-60, bridge=None, if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.749 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:40 localhost kernel: device tap4d6c05df-60 left promiscuous mode Nov 26 04:58:40 localhost nova_compute[281415]: 2025-11-26 09:58:40.759 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.765 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[aa966e22-83dd-4467-a807-6890f76f3c94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.784 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[3267d161-6364-4404-b1e6-a3c8e5afdba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.785 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d25a9777-d5ba-4a5e-bfe9-9cbf66a9554f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.806 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[093c17b8-8618-4911-9e7c-dff92f5a98a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], 
['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 
'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1158361, 'reachable_time': 18598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307631, 'error': None, 'target': 'ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 
'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.809 159623 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4d6c05df-68f7-4c5b-baae-8e36a676fee9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.810 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[21533535-18b9-43ee-8fe7-ce19e6b05bd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.811 159486 INFO neutron.agent.ovn.metadata.agent [-] Port a2423fe8-d5c4-4093-930d-d6a8e0773b38 in datapath 4f38550a-23be-44bf-bced-1e632a24bf8c unbound from our chassis#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.816 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port cee63832-c8ec-4b65-b763-723a40379b0f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.816 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f38550a-23be-44bf-bced-1e632a24bf8c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.817 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ada8c68a-29fc-4552-af60-3d26605b3d01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:40 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:40.818 159486 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up 
ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c namespace which is not needed anymore#033[00m Nov 26 04:58:41 localhost neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c[307307]: [NOTICE] (307312) : haproxy version is 2.8.14-c23fe91 Nov 26 04:58:41 localhost neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c[307307]: [NOTICE] (307312) : path to executable is /usr/sbin/haproxy Nov 26 04:58:41 localhost neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c[307307]: [WARNING] (307312) : Exiting Master process... Nov 26 04:58:41 localhost neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c[307307]: [ALERT] (307312) : Current worker (307314) exited with code 143 (Terminated) Nov 26 04:58:41 localhost neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c[307307]: [WARNING] (307312) : All workers exited. Exiting... (0) Nov 26 04:58:41 localhost systemd[1]: libpod-e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03.scope: Deactivated successfully. 
Nov 26 04:58:41 localhost podman[307649]: 2025-11-26 09:58:41.071254558 +0000 UTC m=+0.094844018 container died e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:58:41 localhost podman[307649]: 2025-11-26 09:58:41.107208598 +0000 UTC m=+0.130798038 container cleanup e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:58:41 localhost podman[307663]: 2025-11-26 09:58:41.164419465 +0000 UTC m=+0.084620738 container cleanup e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0) Nov 26 04:58:41 localhost systemd[1]: libpod-conmon-e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03.scope: Deactivated successfully. Nov 26 04:58:41 localhost podman[307676]: 2025-11-26 09:58:41.243151283 +0000 UTC m=+0.107340197 container remove e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.249 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b36989-847a-43ec-81ec-7a7f2ef1d52d]: (4, ('Wed Nov 26 09:58:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c (e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03)\ne8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03\nWed Nov 26 09:58:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c (e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03)\ne8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.253 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[fb41ba2b-7545-4d01-8345-7da5df20907d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.254 159486 DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f38550a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.258 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:41 localhost kernel: device tap4f38550a-20 left promiscuous mode Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.267 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.272 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[999e9700-3b6d-4e3c-b7fa-357dd96e1a6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.291 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ac63cc-8cb0-4991-b71e-3fb4305fec13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.293 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[95ee14a7-2c0e-4fe1-8d59-164e43f64bc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.312 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4438057a-437b-40fb-b634-b681cbf85901]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], 
['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 
'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1158480, 'reachable_time': 18509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 
255, 'pid': 307697, 'error': None, 'target': 'ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.315 159623 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4f38550a-23be-44bf-bced-1e632a24bf8c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.315 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea655c2-9f30-44a9-8f1c-05a70c1679c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:41.357 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:58:38Z, description=, device_id=022964cf-d685-42b9-9482-9b35b6682ea5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5e44092e-b6b3-491f-b86d-a62fa0794c46, ip_allocation=immediate, mac_address=fa:16:3e:8f:1c:f4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:58:30Z, description=, dns_domain=, id=6a06c83c-a173-43fd-9343-735e8a52503a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-789735874-network, port_security_enabled=True, project_id=3f251213a9644261874d24d123ed8f23, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54121, qos_policy_id=None, revision_number=2, router:external=False, shared=False, 
standard_attr_id=518, status=ACTIVE, subnets=['0cabc8aa-78aa-41f6-b524-37bcf1e2f58a'], tags=[], tenant_id=3f251213a9644261874d24d123ed8f23, updated_at=2025-11-26T09:58:33Z, vlan_transparent=None, network_id=6a06c83c-a173-43fd-9343-735e8a52503a, port_security_enabled=False, project_id=3f251213a9644261874d24d123ed8f23, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=554, status=DOWN, tags=[], tenant_id=3f251213a9644261874d24d123ed8f23, updated_at=2025-11-26T09:58:38Z on network 6a06c83c-a173-43fd-9343-735e8a52503a#033[00m Nov 26 04:58:41 localhost systemd[1]: var-lib-containers-storage-overlay-a43a8b424417f3fa8a7eea8b9e05899066d5875a06891ceb2764a7144980a6ba-merged.mount: Deactivated successfully. Nov 26 04:58:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8bf353e35129ae664e4ff8a3ef5aa6bacb2732ef92ae04226ca9993e2cd5a03-userdata-shm.mount: Deactivated successfully. Nov 26 04:58:41 localhost systemd[1]: run-netns-ovnmeta\x2d4f38550a\x2d23be\x2d44bf\x2dbced\x2d1e632a24bf8c.mount: Deactivated successfully. Nov 26 04:58:41 localhost systemd[1]: var-lib-containers-storage-overlay-de82163ee46623114019a2ce11ea06e49ec01a76a478959ee59a75d7e062b472-merged.mount: Deactivated successfully. Nov 26 04:58:41 localhost systemd[1]: run-netns-ovnmeta\x2d4d6c05df\x2d68f7\x2d4c5b\x2dbaae\x2d8e36a676fee9.mount: Deactivated successfully. Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.633 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:41 localhost systemd[1]: tmp-crun.BH9lU8.mount: Deactivated successfully. 
Nov 26 04:58:41 localhost dnsmasq[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/addn_hosts - 1 addresses Nov 26 04:58:41 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/host Nov 26 04:58:41 localhost podman[307720]: 2025-11-26 09:58:41.667807938 +0000 UTC m=+0.102045378 container kill d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 04:58:41 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/opts Nov 26 04:58:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:41.717 262471 INFO neutron.agent.linux.ip_lib [None req-8a1cd949-7dbb-46c4-99db-5c484b169f17 - - - - - -] Device tap07515a91-8d cannot be used as it has no MAC address#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.727 281419 INFO nova.virt.libvirt.driver [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Deleting instance files /var/lib/nova/instances/e1ef2930-c173-4abb-ba9c-3ef0c0b08400_del#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.728 281419 INFO nova.virt.libvirt.driver [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Deletion of 
/var/lib/nova/instances/e1ef2930-c173-4abb-ba9c-3ef0c0b08400_del complete#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.758 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:41 localhost kernel: device tap07515a91-8d entered promiscuous mode Nov 26 04:58:41 localhost systemd-udevd[307531]: Network interface NamePolicy= disabled on kernel command line. Nov 26 04:58:41 localhost ovn_controller[153664]: 2025-11-26T09:58:41Z|00105|binding|INFO|Claiming lport 07515a91-8d6e-4310-84cc-6bda0ebe4dbc for this chassis. Nov 26 04:58:41 localhost ovn_controller[153664]: 2025-11-26T09:58:41Z|00106|binding|INFO|07515a91-8d6e-4310-84cc-6bda0ebe4dbc: Claiming unknown Nov 26 04:58:41 localhost NetworkManager[5970]: [1764151121.7680] manager: (tap07515a91-8d): new Generic device (/org/freedesktop/NetworkManager/Devices/24) Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.767 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.782 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-c3527677-edc0-4790-a4a3-cebcf166663a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3527677-edc0-4790-a4a3-cebcf166663a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 
'neutron:project_id': 'cae6053338d645a7a195490ea78e074c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1a969964-6f31-4bac-892c-89573ce5cbd0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=07515a91-8d6e-4310-84cc-6bda0ebe4dbc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.784 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 07515a91-8d6e-4310-84cc-6bda0ebe4dbc in datapath c3527677-edc0-4790-a4a3-cebcf166663a bound to our chassis#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.788 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port ac4f873d-b3ea-4913-b5a0-2458e5b63b97 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.789 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3527677-edc0-4790-a4a3-cebcf166663a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:58:41 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:41.791 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f187c1cf-49fd-4681-bfb8-ff3bb36ec23b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:41 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:41.797 2 INFO neutron.agent.securitygroups_rpc [req-d093547e-73d6-40ed-8971-d00735be7899 req-aae4e994-3e12-45e4-9514-edcbbbdc3f4f 
9c32896295294ed68040ac238fcae076 7d3979babeb24fa2a182404ff97181c7 - - default default] Security group rule updated ['8fb342bd-c2b4-44ce-8a07-e9d78bf463a0']#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.805 281419 INFO nova.compute.manager [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Took 1.67 seconds to destroy the instance on the hypervisor.#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.806 281419 DEBUG oslo.service.loopingcall [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.806 281419 DEBUG nova.compute.manager [-] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.807 281419 DEBUG nova.network.neutron [-] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m Nov 26 04:58:41 localhost ovn_controller[153664]: 2025-11-26T09:58:41Z|00107|binding|INFO|Setting lport 07515a91-8d6e-4310-84cc-6bda0ebe4dbc ovn-installed in OVS Nov 26 04:58:41 localhost ovn_controller[153664]: 2025-11-26T09:58:41Z|00108|binding|INFO|Setting lport 07515a91-8d6e-4310-84cc-6bda0ebe4dbc up in Southbound Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.812 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.863 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:41 localhost nova_compute[281415]: 2025-11-26 09:58:41.896 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:42 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:42.122 262471 INFO neutron.agent.dhcp.agent [None req-f0f852f7-897e-4ae2-870e-685717e5e685 - - - - - -] DHCP configuration for ports {'5e44092e-b6b3-491f-b86d-a62fa0794c46'} is completed#033[00m Nov 26 04:58:42 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e97 e97: 6 total, 6 up, 6 in Nov 26 04:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:58:42 localhost podman[307800]: Nov 26 04:58:42 localhost podman[307800]: 2025-11-26 09:58:42.836818104 +0000 UTC m=+0.107185012 container create c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 04:58:42 localhost systemd[1]: tmp-crun.ab28WR.mount: Deactivated successfully. 
Nov 26 04:58:42 localhost podman[307802]: 2025-11-26 09:58:42.863184604 +0000 UTC m=+0.118358932 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 04:58:42 localhost podman[307800]: 2025-11-26 09:58:42.788683414 +0000 UTC 
m=+0.059050352 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:58:42 localhost podman[307807]: 2025-11-26 09:58:42.89139522 +0000 UTC m=+0.144537556 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_id=multipathd) Nov 26 04:58:42 localhost podman[307807]: 2025-11-26 
09:58:42.907399535 +0000 UTC m=+0.160541851 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 04:58:42 localhost podman[307802]: 2025-11-26 09:58:42.921023069 +0000 UTC m=+0.176197397 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent) Nov 26 04:58:42 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:58:42 localhost systemd[1]: Started libpod-conmon-c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8.scope. 
Nov 26 04:58:42 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:58:42 localhost systemd[1]: Started libcrun container. Nov 26 04:58:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/943ce31a4e5f39a6d218114bc386e66ed535d0be6b1895c0a105df51ee53c968/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:58:42 localhost podman[307800]: 2025-11-26 09:58:42.972445109 +0000 UTC m=+0.242812007 container init c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 04:58:42 localhost podman[307800]: 2025-11-26 09:58:42.982863255 +0000 UTC m=+0.253230153 container start c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 04:58:42 localhost dnsmasq[307856]: started, version 2.85 cachesize 150 Nov 26 04:58:42 localhost dnsmasq[307856]: DNS service limited to local subnets Nov 26 04:58:42 localhost dnsmasq[307856]: compile time options: IPv6 
GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 04:58:42 localhost dnsmasq[307856]: warning: no upstream servers configured Nov 26 04:58:42 localhost dnsmasq-dhcp[307856]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 04:58:42 localhost dnsmasq[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/addn_hosts - 0 addresses Nov 26 04:58:42 localhost dnsmasq-dhcp[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/host Nov 26 04:58:42 localhost dnsmasq-dhcp[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/opts Nov 26 04:58:43 localhost nova_compute[281415]: 2025-11-26 09:58:43.036 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:43.194 262471 INFO neutron.agent.dhcp.agent [None req-7acf20b1-b81d-42ba-a55a-c9699e198d98 - - - - - -] DHCP configuration for ports {'777235b6-9b37-4a35-8cf6-640209ecec4b'} is completed#033[00m Nov 26 04:58:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e98 e98: 6 total, 6 up, 6 in Nov 26 04:58:43 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:43.817 2 INFO neutron.agent.securitygroups_rpc [req-9c95162e-141e-4167-a520-8e211d708938 req-a5e5fcc3-c6df-402a-a65a-6dfe152c9488 9c32896295294ed68040ac238fcae076 7d3979babeb24fa2a182404ff97181c7 - - default default] Security group rule updated ['7562502a-9279-429d-8282-0cce114169cd']#033[00m Nov 26 04:58:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:44 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:44.861 2 INFO neutron.agent.securitygroups_rpc [req-9e1ec568-eb05-4874-b324-1a28a704bd34 
req-3c198c7a-ed20-4847-8b2f-9582cce11746 9c32896295294ed68040ac238fcae076 7d3979babeb24fa2a182404ff97181c7 - - default default] Security group rule updated ['f0819cb2-09f4-4da2-92c8-d5813ab79c74']#033[00m Nov 26 04:58:45 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:45.233 2 INFO neutron.agent.securitygroups_rpc [req-0383ae0f-1cfe-464a-9dec-d0faa326c161 req-cfa10f7e-aa64-46e6-81c5-6bf04d179367 9c32896295294ed68040ac238fcae076 7d3979babeb24fa2a182404ff97181c7 - - default default] Security group rule updated ['f0819cb2-09f4-4da2-92c8-d5813ab79c74']#033[00m Nov 26 04:58:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:45.332 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:58:44Z, description=, device_id=d52896d4-cf70-4297-ac1a-320b30d51a7b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ca909b29-f705-46c4-93db-f0bed009b69c, ip_allocation=immediate, mac_address=fa:16:3e:15:d4:ab, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:58:38Z, description=, dns_domain=, id=c3527677-edc0-4790-a4a3-cebcf166663a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestJSON-1115363266-network, port_security_enabled=True, project_id=cae6053338d645a7a195490ea78e074c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17027, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=552, status=ACTIVE, subnets=['bc1d5476-a486-4a93-9dd6-14039be2c929'], tags=[], tenant_id=cae6053338d645a7a195490ea78e074c, updated_at=2025-11-26T09:58:39Z, vlan_transparent=None, network_id=c3527677-edc0-4790-a4a3-cebcf166663a, 
port_security_enabled=False, project_id=cae6053338d645a7a195490ea78e074c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=591, status=DOWN, tags=[], tenant_id=cae6053338d645a7a195490ea78e074c, updated_at=2025-11-26T09:58:44Z on network c3527677-edc0-4790-a4a3-cebcf166663a#033[00m Nov 26 04:58:45 localhost ovn_controller[153664]: 2025-11-26T09:58:45Z|00109|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:58:45 localhost nova_compute[281415]: 2025-11-26 09:58:45.440 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:45 localhost nova_compute[281415]: 2025-11-26 09:58:45.496 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:45 localhost systemd[1]: tmp-crun.LtFNCB.mount: Deactivated successfully. 
Nov 26 04:58:45 localhost dnsmasq[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/addn_hosts - 1 addresses Nov 26 04:58:45 localhost dnsmasq-dhcp[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/host Nov 26 04:58:45 localhost dnsmasq-dhcp[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/opts Nov 26 04:58:45 localhost podman[307872]: 2025-11-26 09:58:45.594597162 +0000 UTC m=+0.087017181 container kill c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 04:58:45 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:45.699 2 INFO neutron.agent.securitygroups_rpc [req-4c21050c-a3f9-4cb9-9815-5871fe14b0ec req-d3659f6b-981f-4569-9fad-43494b28ddf3 9c32896295294ed68040ac238fcae076 7d3979babeb24fa2a182404ff97181c7 - - default default] Security group rule updated ['f0819cb2-09f4-4da2-92c8-d5813ab79c74']#033[00m Nov 26 04:58:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:45.705 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:57:57Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0d550667-720d-4ac9-8474-c7582e0d87e3, ip_allocation=immediate, mac_address=fa:16:3e:e7:38:fd, name=tempest-parent-1619212502, 
network_id=4d6c05df-68f7-4c5b-baae-8e36a676fee9, port_security_enabled=True, project_id=b4dafc326e594f1996993253bb2d58d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=13, security_groups=['43f74207-cce2-45c0-b433-8de91a69071b'], standard_attr_id=275, status=DOWN, tags=[], tenant_id=b4dafc326e594f1996993253bb2d58d6, trunk_details=sub_ports=[], trunk_id=5141c774-a4f6-422c-9887-331cc02fe2c7, updated_at=2025-11-26T09:58:43Z on network 4d6c05df-68f7-4c5b-baae-8e36a676fee9#033[00m Nov 26 04:58:45 localhost openstack_network_exporter[242153]: ERROR 09:58:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:58:45 localhost openstack_network_exporter[242153]: ERROR 09:58:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:58:45 localhost openstack_network_exporter[242153]: ERROR 09:58:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:58:45 localhost openstack_network_exporter[242153]: ERROR 09:58:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:58:45 localhost openstack_network_exporter[242153]: Nov 26 04:58:45 localhost openstack_network_exporter[242153]: ERROR 09:58:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:58:45 localhost openstack_network_exporter[242153]: Nov 26 04:58:46 localhost podman[307909]: 2025-11-26 09:58:46.022841175 +0000 UTC m=+0.071317384 container kill b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 04:58:46 localhost dnsmasq[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/addn_hosts - 2 addresses Nov 26 04:58:46 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/host Nov 26 04:58:46 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/opts Nov 26 04:58:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:46.051 262471 INFO neutron.agent.dhcp.agent [None req-3803cd0f-4f68-4591-98ea-1a41fac0b8f0 - - - - - -] DHCP configuration for ports {'ca909b29-f705-46c4-93db-f0bed009b69c'} is completed#033[00m Nov 26 04:58:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:46.306 262471 INFO neutron.agent.dhcp.agent [None req-e9b0913f-9bd6-49b9-8127-96c620bda33e - - - - - -] DHCP configuration for ports {'0d550667-720d-4ac9-8474-c7582e0d87e3'} is completed#033[00m Nov 26 04:58:46 localhost nova_compute[281415]: 2025-11-26 09:58:46.637 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:46 localhost nova_compute[281415]: 2025-11-26 09:58:46.714 281419 DEBUG nova.network.neutron [-] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:58:46 localhost nova_compute[281415]: 2025-11-26 09:58:46.732 281419 INFO nova.compute.manager [-] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Took 4.93 seconds to deallocate network for instance.#033[00m Nov 26 04:58:46 localhost nova_compute[281415]: 2025-11-26 09:58:46.779 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 
08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:46 localhost nova_compute[281415]: 2025-11-26 09:58:46.780 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:46 localhost nova_compute[281415]: 2025-11-26 09:58:46.783 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:46 localhost nova_compute[281415]: 2025-11-26 09:58:46.829 281419 INFO nova.scheduler.client.report [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Deleted allocations for instance e1ef2930-c173-4abb-ba9c-3ef0c0b08400#033[00m Nov 26 04:58:46 localhost nova_compute[281415]: 2025-11-26 09:58:46.905 281419 DEBUG oslo_concurrency.lockutils [None req-d2ef637f-d162-46a4-8ef7-f714ceac5f2b 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Lock "e1ef2930-c173-4abb-ba9c-3ef0c0b08400" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 6.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:48 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon).osd e99 e99: 6 total, 6 up, 6 in Nov 26 04:58:48 localhost nova_compute[281415]: 2025-11-26 09:58:48.792 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:48 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:48.826 2 INFO neutron.agent.securitygroups_rpc [None req-6b62e28e-dab8-4f4b-ba67-bda4d3d2aff0 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Security group member updated ['43f74207-cce2-45c0-b433-8de91a69071b']#033[00m Nov 26 04:58:48 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:48.925 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:58:44Z, description=, device_id=d52896d4-cf70-4297-ac1a-320b30d51a7b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ca909b29-f705-46c4-93db-f0bed009b69c, ip_allocation=immediate, mac_address=fa:16:3e:15:d4:ab, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:58:38Z, description=, dns_domain=, id=c3527677-edc0-4790-a4a3-cebcf166663a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestJSON-1115363266-network, port_security_enabled=True, project_id=cae6053338d645a7a195490ea78e074c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17027, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=552, status=ACTIVE, subnets=['bc1d5476-a486-4a93-9dd6-14039be2c929'], tags=[], tenant_id=cae6053338d645a7a195490ea78e074c, updated_at=2025-11-26T09:58:39Z, vlan_transparent=None, 
network_id=c3527677-edc0-4790-a4a3-cebcf166663a, port_security_enabled=False, project_id=cae6053338d645a7a195490ea78e074c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=591, status=DOWN, tags=[], tenant_id=cae6053338d645a7a195490ea78e074c, updated_at=2025-11-26T09:58:44Z on network c3527677-edc0-4790-a4a3-cebcf166663a#033[00m Nov 26 04:58:49 localhost dnsmasq[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/addn_hosts - 0 addresses Nov 26 04:58:49 localhost dnsmasq-dhcp[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/host Nov 26 04:58:49 localhost dnsmasq-dhcp[306636]: read /var/lib/neutron/dhcp/4f38550a-23be-44bf-bced-1e632a24bf8c/opts Nov 26 04:58:49 localhost podman[307960]: 2025-11-26 09:58:49.126300701 +0000 UTC m=+0.067379035 container kill b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:58:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:49 localhost systemd[1]: tmp-crun.OYEzws.mount: Deactivated successfully. 
Nov 26 04:58:49 localhost dnsmasq[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/addn_hosts - 1 addresses Nov 26 04:58:49 localhost dnsmasq-dhcp[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/host Nov 26 04:58:49 localhost dnsmasq-dhcp[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/opts Nov 26 04:58:49 localhost podman[307971]: 2025-11-26 09:58:49.182485826 +0000 UTC m=+0.077890164 container kill c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 04:58:49 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:49.204 2 INFO neutron.agent.securitygroups_rpc [None req-76de4253-bb04-4522-8215-af06a3c304e8 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Security group rule updated ['98096288-cb5f-4c7e-bb1e-1596965807ee']#033[00m Nov 26 04:58:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:49.463 262471 INFO neutron.agent.dhcp.agent [None req-bf3ccecf-71e5-45b4-a2ee-0dfc59f96e91 - - - - - -] DHCP configuration for ports {'ca909b29-f705-46c4-93db-f0bed009b69c'} is completed#033[00m Nov 26 04:58:49 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:49.600 2 INFO neutron.agent.securitygroups_rpc [None req-7b842d18-bd8f-438e-abf8-42bace462563 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Security group rule updated ['98096288-cb5f-4c7e-bb1e-1596965807ee']#033[00m Nov 26 04:58:49 localhost dnsmasq[306636]: 
exiting on receipt of SIGTERM Nov 26 04:58:49 localhost podman[308020]: 2025-11-26 09:58:49.754909342 +0000 UTC m=+0.067769047 container kill b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4f38550a-23be-44bf-bced-1e632a24bf8c, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 26 04:58:49 localhost systemd[1]: libpod-b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560.scope: Deactivated successfully. Nov 26 04:58:49 localhost podman[308035]: 2025-11-26 09:58:49.827607428 +0000 UTC m=+0.050481493 container died b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4f38550a-23be-44bf-bced-1e632a24bf8c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 26 04:58:49 localhost podman[308035]: 2025-11-26 09:58:49.866720815 +0000 UTC m=+0.089594850 container remove b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4f38550a-23be-44bf-bced-1e632a24bf8c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:49 localhost ovn_controller[153664]: 2025-11-26T09:58:49Z|00110|binding|INFO|Removing iface tap06cca76d-18 ovn-installed in OVS Nov 26 04:58:49 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:49.873 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port cee63832-c8ec-4b65-b763-723a40379b0f with type ""#033[00m Nov 26 04:58:49 localhost ovn_controller[153664]: 2025-11-26T09:58:49Z|00111|binding|INFO|Removing lport 06cca76d-18d0-4a6c-ad7d-2c93205f45ce ovn-installed in OVS Nov 26 04:58:49 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:49.875 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4f38550a-23be-44bf-bced-1e632a24bf8c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f38550a-23be-44bf-bced-1e632a24bf8c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4dafc326e594f1996993253bb2d58d6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a19e2cd7-918e-429d-b3f8-5eb7832ddf0e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=06cca76d-18d0-4a6c-ad7d-2c93205f45ce) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:49 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:49.877 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 06cca76d-18d0-4a6c-ad7d-2c93205f45ce in datapath 4f38550a-23be-44bf-bced-1e632a24bf8c unbound from our chassis#033[00m Nov 26 04:58:49 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:49.885 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f38550a-23be-44bf-bced-1e632a24bf8c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:58:49 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:49.886 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2fda52-f554-4518-bd50-20fc8f530636]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:49 localhost systemd[1]: libpod-conmon-b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560.scope: Deactivated successfully. 
Nov 26 04:58:49 localhost nova_compute[281415]: 2025-11-26 09:58:49.918 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:49 localhost nova_compute[281415]: 2025-11-26 09:58:49.923 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:49 localhost kernel: device tap06cca76d-18 left promiscuous mode Nov 26 04:58:49 localhost nova_compute[281415]: 2025-11-26 09:58:49.939 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:49.967 262471 INFO neutron.agent.dhcp.agent [None req-06c02a06-d5ab-464f-beb5-e162f2bcbc35 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:58:50 localhost systemd[1]: var-lib-containers-storage-overlay-2873b0d5166b84ff894f50b3b0b76a303a776490f3765b504a03f6a6020334e3-merged.mount: Deactivated successfully. Nov 26 04:58:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b80fa7c1255c0f06df471051a854abc953b5143aef047928c80c4d4e0072a560-userdata-shm.mount: Deactivated successfully. Nov 26 04:58:50 localhost systemd[1]: run-netns-qdhcp\x2d4f38550a\x2d23be\x2d44bf\x2dbced\x2d1e632a24bf8c.mount: Deactivated successfully. 
Nov 26 04:58:50 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:50.301 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:58:50 localhost nova_compute[281415]: 2025-11-26 09:58:50.443 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:50 localhost ovn_controller[153664]: 2025-11-26T09:58:50Z|00112|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:58:50 localhost nova_compute[281415]: 2025-11-26 09:58:50.975 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:51 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:51.517 2 INFO neutron.agent.securitygroups_rpc [None req-c9386fe5-9cb5-46b4-98b7-07a810df210a 08bd3011065645c0b2694bf134099aad b4dafc326e594f1996993253bb2d58d6 - - default default] Security group member updated ['43f74207-cce2-45c0-b433-8de91a69071b']#033[00m Nov 26 04:58:51 localhost nova_compute[281415]: 2025-11-26 09:58:51.641 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:51 localhost systemd[1]: tmp-crun.TQcxyo.mount: Deactivated successfully. 
Nov 26 04:58:51 localhost dnsmasq[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/addn_hosts - 1 addresses Nov 26 04:58:51 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/host Nov 26 04:58:51 localhost podman[308075]: 2025-11-26 09:58:51.787664215 +0000 UTC m=+0.065717065 container kill b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 04:58:51 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/opts Nov 26 04:58:53 localhost dnsmasq[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/addn_hosts - 0 addresses Nov 26 04:58:53 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/host Nov 26 04:58:53 localhost dnsmasq-dhcp[306286]: read /var/lib/neutron/dhcp/4d6c05df-68f7-4c5b-baae-8e36a676fee9/opts Nov 26 04:58:53 localhost podman[308150]: 2025-11-26 09:58:53.466150549 +0000 UTC m=+0.066997984 container kill b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:53 localhost ovn_controller[153664]: 2025-11-26T09:58:53Z|00113|binding|INFO|Releasing lport 0ce966e4-04ff-4d33-bbb4-793f124d8338 from this chassis (sb_readonly=0) Nov 26 04:58:53 localhost kernel: device tap0ce966e4-04 left promiscuous mode Nov 26 04:58:53 localhost nova_compute[281415]: 2025-11-26 09:58:53.673 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:53 localhost ovn_controller[153664]: 2025-11-26T09:58:53Z|00114|binding|INFO|Setting lport 0ce966e4-04ff-4d33-bbb4-793f124d8338 down in Southbound Nov 26 04:58:53 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:53.687 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d6c05df-68f7-4c5b-baae-8e36a676fee9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4dafc326e594f1996993253bb2d58d6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f9e5a10-6cc2-43d9-a208-4616c5b15844, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=0ce966e4-04ff-4d33-bbb4-793f124d8338) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:58:53 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:53.689 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0ce966e4-04ff-4d33-bbb4-793f124d8338 in datapath 4d6c05df-68f7-4c5b-baae-8e36a676fee9 unbound from our chassis#033[00m Nov 26 04:58:53 localhost nova_compute[281415]: 2025-11-26 09:58:53.701 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:53 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:53.702 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d6c05df-68f7-4c5b-baae-8e36a676fee9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:58:53 localhost ovn_metadata_agent[159481]: 2025-11-26 09:58:53.704 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[5e973645-bb17-4b9b-8c87-d754aa78e3dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:58:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e100 e100: 6 total, 6 up, 6 in Nov 26 04:58:54 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:58:54 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:58:54 localhost sshd[308223]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 
09:58:55.348 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.372 281419 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.372 281419 INFO nova.compute.manager [-] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] VM Stopped (Lifecycle Event)#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.420 281419 DEBUG nova.compute.manager [None req-7ea439b7-18e4-4ed7-bcba-647724b48869 - - - - - -] [instance: e1ef2930-c173-4abb-ba9c-3ef0c0b08400] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.444 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.611 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "af8a19fc-9bd6-4666-942c-7f001cd8070a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.612 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.631 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.715 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.716 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.722 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Require both a host and instance NUMA topology to fit instance on host. 
numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.722 281419 INFO nova.compute.claims [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Claim successful on node np0005536118.localdomain#033[00m Nov 26 04:58:55 localhost nova_compute[281415]: 2025-11-26 09:58:55.879 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:58:56 localhost ovn_controller[153664]: 2025-11-26T09:58:56Z|00115|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.098 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:56 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:58:56 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1607225633' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.351 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.360 281419 DEBUG nova.compute.provider_tree [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.385 281419 DEBUG nova.scheduler.client.report [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.412 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default 
default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.413 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.471 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.472 281419 DEBUG nova.network.neutron [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.500 281419 INFO nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.521 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.604 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.605 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.606 281419 INFO nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Creating image(s)#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.639 281419 DEBUG nova.storage.rbd_utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] rbd image af8a19fc-9bd6-4666-942c-7f001cd8070a_disk does not exist __init__ 
/usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.677 281419 DEBUG nova.storage.rbd_utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] rbd image af8a19fc-9bd6-4666-942c-7f001cd8070a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.709 281419 DEBUG nova.storage.rbd_utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] rbd image af8a19fc-9bd6-4666-942c-7f001cd8070a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.714 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "6b2dc069605f46768004f3dbdb2e7a1bd9152842" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.715 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "6b2dc069605f46768004f3dbdb2e7a1bd9152842" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.732 281419 WARNING oslo_policy.policy [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default 
default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.732 281419 WARNING oslo_policy.policy [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.735 281419 DEBUG nova.policy [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77a0b73f7f574b48a9c231e26511534a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f251213a9644261874d24d123ed8f23', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.736 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.791 281419 DEBUG nova.virt.libvirt.imagebackend [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Image locations are: [{'url': 'rbd://0d5e5e6d-3c4b-5efe-8c65-346ae6715606/images/211ae400-609a-4c22-9588-f4189139a50b/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://0d5e5e6d-3c4b-5efe-8c65-346ae6715606/images/211ae400-609a-4c22-9588-f4189139a50b/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.846 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.847 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 26 04:58:56 localhost nova_compute[281415]: 2025-11-26 09:58:56.874 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 26 04:58:57 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:57.002 2 
INFO neutron.agent.securitygroups_rpc [req-5481e922-644f-4e93-a82c-902747102392 req-a7ca780c-f0f5-4b30-b6f5-f0e3ee1b3768 8fff85d5cd9242018e400aeb4a644df9 5e2975cef40c41318a1fdb4067daaa9a - - default default] Security group rule updated ['319ef241-f210-4a0c-9edd-2a43ee7a9e80']#033[00m Nov 26 04:58:57 localhost podman[240049]: time="2025-11-26T09:58:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:58:57 localhost podman[240049]: @ - - [26/Nov/2025:09:58:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 161157 "" "Go-http-client/1.1" Nov 26 04:58:57 localhost podman[240049]: @ - - [26/Nov/2025:09:58:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20674 "" "Go-http-client/1.1" Nov 26 04:58:57 localhost nova_compute[281415]: 2025-11-26 09:58:57.768 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:58:57 localhost nova_compute[281415]: 2025-11-26 09:58:57.854 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842.part --force-share --output=json" returned: 0 in 0.086s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:58:57 localhost nova_compute[281415]: 2025-11-26 09:58:57.856 281419 DEBUG nova.virt.images [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] 211ae400-609a-4c22-9588-f4189139a50b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m Nov 26 04:58:57 localhost nova_compute[281415]: 2025-11-26 09:58:57.858 281419 DEBUG nova.privsep.utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Nov 26 04:58:57 localhost nova_compute[281415]: 2025-11-26 09:58:57.859 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842.part /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:58:57 localhost nova_compute[281415]: 2025-11-26 09:58:57.879 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:58:57 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:57.943 2 INFO neutron.agent.securitygroups_rpc [req-991ca720-2561-4655-a74b-916ee597b993 req-35a98037-a2b5-4521-8b4d-7f375efc7032 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default 
default] Security group member updated ['98096288-cb5f-4c7e-bb1e-1596965807ee']#033[00m Nov 26 04:58:58 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:58.012 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:58:57Z, description=, device_id=af8a19fc-9bd6-4666-942c-7f001cd8070a, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b, ip_allocation=immediate, mac_address=fa:16:3e:d6:8a:86, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:58:30Z, description=, dns_domain=, id=6a06c83c-a173-43fd-9343-735e8a52503a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-789735874-network, port_security_enabled=True, project_id=3f251213a9644261874d24d123ed8f23, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54121, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=518, status=ACTIVE, subnets=['0cabc8aa-78aa-41f6-b524-37bcf1e2f58a'], tags=[], tenant_id=3f251213a9644261874d24d123ed8f23, updated_at=2025-11-26T09:58:33Z, vlan_transparent=None, network_id=6a06c83c-a173-43fd-9343-735e8a52503a, port_security_enabled=True, project_id=3f251213a9644261874d24d123ed8f23, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['98096288-cb5f-4c7e-bb1e-1596965807ee'], standard_attr_id=612, status=DOWN, tags=[], tenant_id=3f251213a9644261874d24d123ed8f23, updated_at=2025-11-26T09:58:57Z on network 6a06c83c-a173-43fd-9343-735e8a52503a#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.152 281419 DEBUG 
oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842.part /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842.converted" returned: 0 in 0.293s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.158 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.228 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842.converted --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.230 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "6b2dc069605f46768004f3dbdb2e7a1bd9152842" "released" by 
"nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0. Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.246394) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31 Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151138246444, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 658, "num_deletes": 252, "total_data_size": 656698, "memory_usage": 668768, "flush_reason": "Manual Compaction"} Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151138253880, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 361510, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19868, "largest_seqno": 20521, "table_properties": {"data_size": 358529, "index_size": 900, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8224, "raw_average_key_size": 21, "raw_value_size": 352124, "raw_average_value_size": 902, "num_data_blocks": 40, "num_entries": 390, "num_filter_entries": 390, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, 
"filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151110, "oldest_key_time": 1764151110, "file_creation_time": 1764151138, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}} Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7567 microseconds, and 2057 cpu microseconds. Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.253926) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 361510 bytes OK Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.253985) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.256453) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.256479) EVENT_LOG_v1 {"time_micros": 1764151138256472, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.256571) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 653040, prev total WAL file size 653364, number of live WAL files 2. Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.257387) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373537' seq:72057594037927935, type:22 .. 
'6D6772737461740034303039' seq:0, type:0; will stop at (end) Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(353KB)], [30(16MB)] Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151138257492, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 17844480, "oldest_snapshot_seqno": -1} Nov 26 04:58:58 localhost systemd[1]: tmp-crun.zs12ZE.mount: Deactivated successfully. Nov 26 04:58:58 localhost dnsmasq[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/addn_hosts - 2 addresses Nov 26 04:58:58 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/host Nov 26 04:58:58 localhost podman[308331]: 2025-11-26 09:58:58.285787474 +0000 UTC m=+0.089483756 container kill d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 04:58:58 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/opts Nov 26 04:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 04:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.313 281419 DEBUG nova.storage.rbd_utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] rbd image af8a19fc-9bd6-4666-942c-7f001cd8070a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.318 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842 af8a19fc-9bd6-4666-942c-7f001cd8070a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11885 keys, 15670353 bytes, temperature: kUnknown Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151138330496, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 15670353, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15604886, "index_size": 34646, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29765, "raw_key_size": 318075, "raw_average_key_size": 26, "raw_value_size": 15404700, "raw_average_value_size": 1296, "num_data_blocks": 1310, "num_entries": 11885, "num_filter_entries": 11885, "num_deletions": 0, 
"num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151138, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}} Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.331008) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 15670353 bytes Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.333504) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 243.5 rd, 213.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 16.7 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(92.7) write-amplify(43.3) OK, records in: 12402, records dropped: 517 output_compression: NoCompression Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.333529) EVENT_LOG_v1 {"time_micros": 1764151138333519, "job": 16, "event": "compaction_finished", "compaction_time_micros": 73291, "compaction_time_cpu_micros": 45279, "output_level": 6, "num_output_files": 1, "total_output_size": 15670353, "num_input_records": 12402, "num_output_records": 11885, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151138333695, "job": 16, "event": "table_file_deletion", "file_number": 32} Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151138336059, 
"job": 16, "event": "table_file_deletion", "file_number": 30} Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.257227) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.336181) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.336190) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.336193) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.336196) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:58:58 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:58:58.336199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:58:58 localhost ovn_controller[153664]: 2025-11-26T09:58:58Z|00116|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:58:58 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:58:58 localhost podman[308362]: 2025-11-26 09:58:58.39904156 +0000 UTC m=+0.087480045 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 
'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:58:58 localhost podman[308362]: 2025-11-26 09:58:58.414081006 +0000 UTC m=+0.102519541 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.422 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:58:58 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.460 281419 DEBUG nova.network.neutron [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Successfully created port: e174856e-03c8-44b0-b9d4-dd4da2f98b9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m Nov 26 04:58:58 localhost podman[308363]: 2025-11-26 09:58:58.480859072 +0000 UTC m=+0.169176094 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 26 04:58:58 localhost neutron_sriov_agent[255515]: 2025-11-26 09:58:58.495 2 INFO neutron.agent.securitygroups_rpc [req-f1768fb1-1049-4b5b-94df-aa36ff71aa5f req-4465e1fa-97dc-480b-acf8-10af36df7253 8fff85d5cd9242018e400aeb4a644df9 5e2975cef40c41318a1fdb4067daaa9a - - default default] Security group rule updated ['319ef241-f210-4a0c-9edd-2a43ee7a9e80']#033[00m Nov 26 04:58:58 localhost podman[308363]: 2025-11-26 09:58:58.516419771 +0000 UTC m=+0.204736783 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 04:58:58 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:58:58 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:58.615 262471 INFO neutron.agent.dhcp.agent [None req-3efb37c7-1280-4651-846c-13bc2acdd335 - - - - - -] DHCP configuration for ports {'e174856e-03c8-44b0-b9d4-dd4da2f98b9b'} is completed#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 04:58:58 localhost nova_compute[281415]: 2025-11-26 09:58:58.907 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/6b2dc069605f46768004f3dbdb2e7a1bd9152842 af8a19fc-9bd6-4666-942c-7f001cd8070a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.589s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.028 281419 DEBUG nova.storage.rbd_utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] resizing rbd image af8a19fc-9bd6-4666-942c-7f001cd8070a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m Nov 26 04:58:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:59.030 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005536118.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:58:57Z, description=, device_id=af8a19fc-9bd6-4666-942c-7f001cd8070a, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[], id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b, ip_allocation=immediate, mac_address=fa:16:3e:d6:8a:86, name=, network_id=6a06c83c-a173-43fd-9343-735e8a52503a, port_security_enabled=True, project_id=3f251213a9644261874d24d123ed8f23, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, 
security_groups=['98096288-cb5f-4c7e-bb1e-1596965807ee'], standard_attr_id=612, status=DOWN, tags=[], tenant_id=3f251213a9644261874d24d123ed8f23, updated_at=2025-11-26T09:58:58Z on network 6a06c83c-a173-43fd-9343-735e8a52503a#033[00m Nov 26 04:58:59 localhost dnsmasq[306286]: exiting on receipt of SIGTERM Nov 26 04:58:59 localhost podman[308499]: 2025-11-26 09:58:59.131458901 +0000 UTC m=+0.061154006 container kill b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:58:59 localhost systemd[1]: libpod-b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c.scope: Deactivated successfully. 
Nov 26 04:58:59 localhost podman[308526]: 2025-11-26 09:58:59.198663349 +0000 UTC m=+0.056560077 container died b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.219 281419 DEBUG nova.objects.instance [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lazy-loading 'migration_context' on Instance uuid af8a19fc-9bd6-4666-942c-7f001cd8070a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.230 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Nov 26 04:58:59 localhost podman[308526]: 2025-11-26 09:58:59.231339621 +0000 UTC m=+0.089236299 container cleanup b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.231 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Ensure instance console log exists: /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.231 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.232 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.232 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:58:59 localhost 
systemd[1]: libpod-conmon-b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c.scope: Deactivated successfully. Nov 26 04:58:59 localhost podman[308537]: 2025-11-26 09:58:59.257117393 +0000 UTC m=+0.094697303 container remove b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d6c05df-68f7-4c5b-baae-8e36a676fee9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 04:58:59 localhost systemd[1]: var-lib-containers-storage-overlay-a99d75efdc041b4844040716945a6b500b9749f2906c29185ec9f70eb18989fd-merged.mount: Deactivated successfully. Nov 26 04:58:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b58c5885be267e41f25855c6e223cc21d67e81e399edc30b105f880aec89922c-userdata-shm.mount: Deactivated successfully. Nov 26 04:58:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:59.279 262471 INFO neutron.agent.dhcp.agent [None req-c0062f55-6a3c-43c4-a694-29fca361dbfe - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:58:59 localhost systemd[1]: run-netns-qdhcp\x2d4d6c05df\x2d68f7\x2d4c5b\x2dbaae\x2d8e36a676fee9.mount: Deactivated successfully. 
Nov 26 04:58:59 localhost dnsmasq[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/addn_hosts - 2 addresses Nov 26 04:58:59 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/host Nov 26 04:58:59 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/opts Nov 26 04:58:59 localhost podman[308562]: 2025-11-26 09:58:59.31139687 +0000 UTC m=+0.120979731 container kill d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.434 281419 DEBUG nova.network.neutron [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Successfully updated port: e174856e-03c8-44b0-b9d4-dd4da2f98b9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.457 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.457 281419 DEBUG oslo_concurrency.lockutils [None 
req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquired lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.458 281419 DEBUG nova.network.neutron [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 26 04:58:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:59.499 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.512 281419 DEBUG nova.network.neutron [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.534 281419 DEBUG nova.compute.manager [req-3deb7588-e6fc-4f1d-bab7-429850214e00 req-8137c889-ba0c-4161-96a2-309e73e3cfaf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Received event network-changed-e174856e-03c8-44b0-b9d4-dd4da2f98b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.535 281419 DEBUG nova.compute.manager [req-3deb7588-e6fc-4f1d-bab7-429850214e00 req-8137c889-ba0c-4161-96a2-309e73e3cfaf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Refreshing instance network info cache due to event network-changed-e174856e-03c8-44b0-b9d4-dd4da2f98b9b. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.535 281419 DEBUG oslo_concurrency.lockutils [req-3deb7588-e6fc-4f1d-bab7-429850214e00 req-8137c889-ba0c-4161-96a2-309e73e3cfaf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:58:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:58:59.577 262471 INFO neutron.agent.dhcp.agent [None req-e323ab10-a93c-429a-be69-748b75e747bc - - - - - -] DHCP configuration for ports {'e174856e-03c8-44b0-b9d4-dd4da2f98b9b'} is completed#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.788 281419 DEBUG nova.network.neutron [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Updating instance_info_cache with network_info: [{"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", 
"ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.809 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Releasing lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.809 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Instance network_info: |[{"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.810 281419 DEBUG oslo_concurrency.lockutils [req-3deb7588-e6fc-4f1d-bab7-429850214e00 req-8137c889-ba0c-4161-96a2-309e73e3cfaf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquired lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.810 281419 DEBUG nova.network.neutron [req-3deb7588-e6fc-4f1d-bab7-429850214e00 req-8137c889-ba0c-4161-96a2-309e73e3cfaf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Refreshing network info cache for port e174856e-03c8-44b0-b9d4-dd4da2f98b9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.816 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Start _get_guest_xml network_info=[{"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": 
"10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-26T09:57:08Z,direct_url=,disk_format='qcow2',id=211ae400-609a-4c22-9588-f4189139a50b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b2fe3cd6f6ea49b8a2de01b236dd92e3',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-11-26T09:57:10Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'disk_bus': 'virtio', 'boot_index': 0, 'image_id': '211ae400-609a-4c22-9588-f4189139a50b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.823 281419 WARNING nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - 
default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.835 281419 DEBUG nova.virt.libvirt.host [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Searching host: 'np0005536118.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.836 281419 DEBUG nova.virt.libvirt.host [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.839 281419 DEBUG nova.virt.libvirt.host [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Searching host: 'np0005536118.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.840 281419 DEBUG nova.virt.libvirt.host [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.840 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.841 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-26T09:57:07Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3b6f30a1-d6bf-48f0-b946-f1964a0a6750',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-26T09:57:08Z,direct_url=,disk_format='qcow2',id=211ae400-609a-4c22-9588-f4189139a50b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='b2fe3cd6f6ea49b8a2de01b236dd92e3',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-11-26T09:57:10Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.841 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.842 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.842 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.842 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.843 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.843 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.843 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.844 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.844 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.844 281419 DEBUG nova.virt.hardware [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.849 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:58:59 localhost nova_compute[281415]: 2025-11-26 09:58:59.867 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.228 281419 DEBUG nova.network.neutron [req-3deb7588-e6fc-4f1d-bab7-429850214e00 req-8137c889-ba0c-4161-96a2-309e73e3cfaf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Updated VIF entry in instance network info cache for port e174856e-03c8-44b0-b9d4-dd4da2f98b9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.229 281419 DEBUG nova.network.neutron [req-3deb7588-e6fc-4f1d-bab7-429850214e00 req-8137c889-ba0c-4161-96a2-309e73e3cfaf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Updating instance_info_cache with network_info: [{"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.244 281419 DEBUG oslo_concurrency.lockutils [req-3deb7588-e6fc-4f1d-bab7-429850214e00 req-8137c889-ba0c-4161-96a2-309e73e3cfaf ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Releasing lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:59:00 localhost systemd[1]: tmp-crun.r4PGZe.mount: Deactivated successfully. 
Nov 26 04:59:00 localhost dnsmasq[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/addn_hosts - 0 addresses Nov 26 04:59:00 localhost dnsmasq-dhcp[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/host Nov 26 04:59:00 localhost dnsmasq-dhcp[307856]: read /var/lib/neutron/dhcp/c3527677-edc0-4790-a4a3-cebcf166663a/opts Nov 26 04:59:00 localhost podman[308637]: 2025-11-26 09:59:00.36811014 +0000 UTC m=+0.079544404 container kill c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 04:59:00 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/264596584' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.446 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.448 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.599s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.482 281419 DEBUG nova.storage.rbd_utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] rbd image af8a19fc-9bd6-4666-942c-7f001cd8070a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.489 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:00 localhost ovn_controller[153664]: 2025-11-26T09:59:00Z|00117|binding|INFO|Releasing lport 07515a91-8d6e-4310-84cc-6bda0ebe4dbc from this chassis (sb_readonly=0) Nov 26 04:59:00 localhost kernel: device tap07515a91-8d left promiscuous mode Nov 26 04:59:00 localhost ovn_controller[153664]: 2025-11-26T09:59:00Z|00118|binding|INFO|Setting lport 07515a91-8d6e-4310-84cc-6bda0ebe4dbc down in Southbound Nov 26 
04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.562 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:00 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:00.578 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-c3527677-edc0-4790-a4a3-cebcf166663a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3527677-edc0-4790-a4a3-cebcf166663a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cae6053338d645a7a195490ea78e074c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1a969964-6f31-4bac-892c-89573ce5cbd0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=07515a91-8d6e-4310-84cc-6bda0ebe4dbc) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.581 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:00 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:00.582 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port 07515a91-8d6e-4310-84cc-6bda0ebe4dbc in datapath c3527677-edc0-4790-a4a3-cebcf166663a unbound from our chassis#033[00m Nov 26 04:59:00 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:00.586 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3527677-edc0-4790-a4a3-cebcf166663a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:59:00 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:00.587 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[18ef6981-cd65-4405-8abb-45719ca0ad88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:59:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 04:59:00 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/451367209' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.969 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.972 281419 DEBUG nova.virt.libvirt.vif [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-26T09:58:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005536118.localdomain',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=9,image_ref='211ae400-609a-4c22-9588-f4189139a50b',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAljQH8gCNgGhATjkietZYVOGUv9VI425iWTya/TD39tqiN7SIxn9uznipoLdXT8R/xFtDcXKwPW29szUZwpP3LzmUxMGKMSAF2UU0eEvXxvMbIuwBuMXg4aL08pjqc1bw==',key_name='tempest-keypair-1335953635',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005536118.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f251213a9644261874d24d123ed8f23',ramdisk_id='',reservation_id='r-esc8e7o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='211ae400-609a-4c22-9588-f4189139a50b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-674665483',owner_user_name='tempest-ServersV294TestFqdnHostnames-674665483-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-26T09:58:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77a0b73f7f574b48a9c231e26511534a',uuid=af8a19fc-9bd6-4666-942c-7f001cd8070a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.972 281419 DEBUG nova.network.os_vif_util [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Converting VIF {"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.973 281419 DEBUG nova.network.os_vif_util [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:8a:86,bridge_name='br-int',has_traffic_filtering=True,id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b,network=Network(6a06c83c-a173-43fd-9343-735e8a52503a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape174856e-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:59:00 localhost nova_compute[281415]: 2025-11-26 09:59:00.976 281419 DEBUG nova.objects.instance [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lazy-loading 'pci_devices' on Instance uuid af8a19fc-9bd6-4666-942c-7f001cd8070a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.021 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] End _get_guest_xml xml= Nov 26 04:59:01 localhost nova_compute[281415]: af8a19fc-9bd6-4666-942c-7f001cd8070a Nov 26 04:59:01 localhost nova_compute[281415]: instance-00000009 Nov 26 04:59:01 localhost nova_compute[281415]: 131072 Nov 26 04:59:01 localhost nova_compute[281415]: 1 Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 
26 04:59:01 localhost nova_compute[281415]: guest-instance-1 Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:58:59 Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: 128 Nov 26 04:59:01 localhost nova_compute[281415]: 1 Nov 26 04:59:01 localhost nova_compute[281415]: 0 Nov 26 04:59:01 localhost nova_compute[281415]: 0 Nov 26 04:59:01 localhost nova_compute[281415]: 1 Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: tempest-ServersV294TestFqdnHostnames-674665483-project-member Nov 26 04:59:01 localhost nova_compute[281415]: tempest-ServersV294TestFqdnHostnames-674665483 Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: RDO Nov 26 04:59:01 localhost nova_compute[281415]: OpenStack Compute Nov 26 04:59:01 localhost nova_compute[281415]: 27.5.2-0.20250829104910.6f8decf.el9 Nov 26 04:59:01 localhost nova_compute[281415]: af8a19fc-9bd6-4666-942c-7f001cd8070a Nov 26 04:59:01 localhost nova_compute[281415]: af8a19fc-9bd6-4666-942c-7f001cd8070a Nov 26 04:59:01 localhost nova_compute[281415]: Virtual Machine Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: hvm Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 
04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost 
nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: /dev/urandom Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost 
nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: Nov 26 04:59:01 localhost nova_compute[281415]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.022 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Preparing to wait for external event network-vif-plugged-e174856e-03c8-44b0-b9d4-dd4da2f98b9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.023 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.024 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.024 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] 
Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.025 281419 DEBUG nova.virt.libvirt.vif [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-26T09:58:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005536118.localdomain',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=9,image_ref='211ae400-609a-4c22-9588-f4189139a50b',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAljQH8gCNgGhATjkietZYVOGUv9VI425iWTya/TD39tqiN7SIxn9uznipoLdXT8R/xFtDcXKwPW29szUZwpP3LzmUxMGKMSAF2UU0eEvXxvMbIuwBuMXg4aL08pjqc1bw==',key_name='tempest-keypair-1335953635',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005536118.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3f251213a9644261874d24d123ed8f23',ramdisk_id='',reservation_id='r-esc8e7o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='211ae400-609a-4c22-9588-f4189139a50b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-674665483',owner_user_name='tempest-ServersV294TestFqdnHostnames-674665483-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-26T09:58:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77a0b73f7f574b48a9c231e26511534a',uuid=af8a19fc-9bd6-4666-942c-7f001cd8070a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.026 281419 DEBUG nova.network.os_vif_util [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Converting VIF {"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.026 281419 DEBUG nova.network.os_vif_util [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:8a:86,bridge_name='br-int',has_traffic_filtering=True,id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b,network=Network(6a06c83c-a173-43fd-9343-735e8a52503a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape174856e-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.027 281419 DEBUG os_vif [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:8a:86,bridge_name='br-int',has_traffic_filtering=True,id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b,network=Network(6a06c83c-a173-43fd-9343-735e8a52503a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape174856e-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.028 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.028 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.029 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.033 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.033 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape174856e-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.034 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape174856e-03, col_values=(('external_ids', {'iface-id': 'e174856e-03c8-44b0-b9d4-dd4da2f98b9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:8a:86', 'vm-uuid': 'af8a19fc-9bd6-4666-942c-7f001cd8070a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.076 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.079 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.085 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.086 281419 INFO os_vif [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:8a:86,bridge_name='br-int',has_traffic_filtering=True,id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b,network=Network(6a06c83c-a173-43fd-9343-735e8a52503a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape174856e-03')#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.139 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.140 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] No BDM found with device name sda, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.140 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] No VIF found with MAC fa:16:3e:d6:8a:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.141 281419 INFO nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Using config drive#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.174 281419 DEBUG nova.storage.rbd_utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] rbd image af8a19fc-9bd6-4666-942c-7f001cd8070a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.304 281419 INFO nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Creating config drive at /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a/disk.config#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.312 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a/disk.config -ldots 
-allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu2vmthg3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.444 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu2vmthg3" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.481 281419 DEBUG nova.storage.rbd_utils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] rbd image af8a19fc-9bd6-4666-942c-7f001cd8070a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.491 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a/disk.config af8a19fc-9bd6-4666-942c-7f001cd8070a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.649 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.703 281419 DEBUG oslo_concurrency.processutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a/disk.config af8a19fc-9bd6-4666-942c-7f001cd8070a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.704 281419 INFO nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Deleting local config drive /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a/disk.config because it was imported into RBD.#033[00m Nov 26 04:59:01 localhost kernel: device tape174856e-03 entered promiscuous mode Nov 26 04:59:01 localhost NetworkManager[5970]: [1764151141.7571] manager: (tape174856e-03): new Tun device (/org/freedesktop/NetworkManager/Devices/25) Nov 26 04:59:01 localhost ovn_controller[153664]: 2025-11-26T09:59:01Z|00119|binding|INFO|Claiming lport e174856e-03c8-44b0-b9d4-dd4da2f98b9b for this chassis. Nov 26 04:59:01 localhost ovn_controller[153664]: 2025-11-26T09:59:01Z|00120|binding|INFO|e174856e-03c8-44b0-b9d4-dd4da2f98b9b: Claiming fa:16:3e:d6:8a:86 10.100.0.10 Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.760 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:01 localhost systemd-udevd[308772]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.768 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:8a:86 10.100.0.10'], port_security=['fa:16:3e:d6:8a:86 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'af8a19fc-9bd6-4666-942c-7f001cd8070a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a06c83c-a173-43fd-9343-735e8a52503a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f251213a9644261874d24d123ed8f23', 'neutron:revision_number': '2', 'neutron:security_group_ids': '98096288-cb5f-4c7e-bb1e-1596965807ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07926bbc-17e6-41e5-b392-ed892f429233, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=e174856e-03c8-44b0-b9d4-dd4da2f98b9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.769 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e174856e-03c8-44b0-b9d4-dd4da2f98b9b in datapath 6a06c83c-a173-43fd-9343-735e8a52503a bound to our chassis#033[00m Nov 26 04:59:01 localhost ovn_controller[153664]: 2025-11-26T09:59:01Z|00121|binding|INFO|Setting lport e174856e-03c8-44b0-b9d4-dd4da2f98b9b ovn-installed in OVS Nov 26 04:59:01 localhost ovn_controller[153664]: 2025-11-26T09:59:01Z|00122|binding|INFO|Setting lport 
e174856e-03c8-44b0-b9d4-dd4da2f98b9b up in Southbound Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.772 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port b38eeded-6aa3-44f5-a834-8d27f0d849a6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.773 159486 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a06c83c-a173-43fd-9343-735e8a52503a#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.775 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:01 localhost NetworkManager[5970]: [1764151141.7812] device (tape174856e-03): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 26 04:59:01 localhost NetworkManager[5970]: [1764151141.7823] device (tape174856e-03): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.782 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4981a85e-ec0d-4a7c-9a25-613bdb819932]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.785 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a06c83c-a1 in ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.788 159592 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a06c83c-a0 not found in namespace None get_link_id 
/usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.788 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[05b9adc3-43a8-47ab-b47d-7ab42ad8de4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.790 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[6eee1b2d-9a09-4f56-848b-d72da6a509f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost systemd-machined[83873]: New machine qemu-4-instance-00000009. Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.804 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[f9f4d31b-2c32-41a8-b4de-ca6f291805ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost systemd[1]: Started Virtual Machine qemu-4-instance-00000009. 
Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.833 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd86b88-ccbf-4935-a842-b86e977f5e51]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.861 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[ea3b496a-f47e-4296-822c-294679dd312c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost NetworkManager[5970]: [1764151141.8687] manager: (tap6a06c83c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/26) Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.869 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1274ea9e-6f18-4271-912c-efbe0b01369d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.875 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.875 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.876 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.876 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 04:59:01 localhost nova_compute[281415]: 2025-11-26 09:59:01.876 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.901 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf7b5f2-0b2e-4aec-a82c-af7a3d30a349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.904 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[d221c4ac-ce5a-4758-9bc5-a13c5b1ff9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap6a06c83c-a1: link becomes ready Nov 26 04:59:01 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap6a06c83c-a0: link becomes ready Nov 26 04:59:01 localhost NetworkManager[5970]: [1764151141.9278] device (tap6a06c83c-a0): 
carrier: link connected Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.931 159603 DEBUG oslo.privsep.daemon [-] privsep: reply[03ebf39d-083c-4be3-9f48-56a439c86e05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.950 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[556a3127-e915-4890-881d-16ed64852321]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a06c83c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:71:a9:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 
'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1161709, 'reachable_time': 42833, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 
'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308809, 'error': None, 'target': 'ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.971 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[884eba0e-d934-4f08-b0d4-f40b8ec823ce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe71:a9a8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1161709, 'tstamp': 1161709}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308817, 'error': None, 'target': 'ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:01 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:01.990 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[48e31c92-864a-4cac-9544-398bd3c8bfdf]: (4, 
[{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a06c83c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:71:a9:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', 
{'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1161709, 'reachable_time': 42833, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 
'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308826, 'error': None, 'target': 'ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.021 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[805028b3-1b19-49e7-b2e4-1cbd9ee9c9ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.086 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[678810d4-e89a-46d5-88fe-82f2d9652ef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.089 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a06c83c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.089 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.090 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a06c83c-a0, may_exist=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.092 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost kernel: device tap6a06c83c-a0 entered promiscuous mode Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.100 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a06c83c-a0, col_values=(('external_ids', {'iface-id': 'd160bab8-f6a4-4dc7-b4dd-c79a804f7583'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.102 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost ovn_controller[153664]: 2025-11-26T09:59:02Z|00123|binding|INFO|Releasing lport d160bab8-f6a4-4dc7-b4dd-c79a804f7583 from this chassis (sb_readonly=0) Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.103 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.104 159486 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a06c83c-a173-43fd-9343-735e8a52503a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a06c83c-a173-43fd-9343-735e8a52503a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.106 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ad59ad13-f498-48b0-a080-a962d526b5c1]: (4, 
None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.107 159486 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: global Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: log /dev/log local0 debug Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: log-tag haproxy-metadata-proxy-6a06c83c-a173-43fd-9343-735e8a52503a Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: user root Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: group root Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: maxconn 1024 Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: pidfile /var/lib/neutron/external/pids/6a06c83c-a173-43fd-9343-735e8a52503a.pid.haproxy Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: daemon Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: defaults Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: log global Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: mode http Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: option httplog Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: option dontlognull Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: option http-server-close Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: option forwardfor Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: retries 3 Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: timeout http-request 30s Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: timeout connect 30s Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: timeout client 32s Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: timeout server 32s Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: timeout http-keep-alive 30s Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: Nov 26 04:59:02 
localhost ovn_metadata_agent[159481]: listen listener Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: bind 169.254.169.254:80 Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: server metadata /var/lib/neutron/metadata_proxy Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: http-request add-header X-OVN-Network-ID 6a06c83c-a173-43fd-9343-735e8a52503a Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.110 159486 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a', 'env', 'PROCESS_TAG=haproxy-6a06c83c-a173-43fd-9343-735e8a52503a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a06c83c-a173-43fd-9343-735e8a52503a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.111 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.229 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.229 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] VM Started (Lifecycle Event)#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.263 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Checking state 
_get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.269 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.269 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] VM Paused (Lifecycle Event)#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.297 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.303 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 26 04:59:02 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:59:02 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/777237426' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.325 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.329 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.416 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.417 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.423 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.423 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000009 as it does 
not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 04:59:02 localhost podman[308908]: Nov 26 04:59:02 localhost podman[308908]: 2025-11-26 09:59:02.634107699 +0000 UTC m=+0.104013756 container create 0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:59:02 localhost systemd[1]: Started libpod-conmon-0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec.scope. Nov 26 04:59:02 localhost podman[308908]: 2025-11-26 09:59:02.581660878 +0000 UTC m=+0.051566965 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 26 04:59:02 localhost systemd[1]: Started libcrun container. 
Nov 26 04:59:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ab16417e4894483337d1fa0f10cfbe2c30cc5cafd7c0c4eeabb73326e60dff7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:59:02 localhost podman[308908]: 2025-11-26 09:59:02.726388369 +0000 UTC m=+0.196294436 container init 0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.730 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.732 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11215MB free_disk=41.50426483154297GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.733 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.733 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:02 localhost podman[308908]: 2025-11-26 09:59:02.737766784 +0000 UTC m=+0.207672841 container start 0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:59:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:02.755 262471 INFO neutron.agent.linux.ip_lib [None req-4adfeb04-9da2-4fe4-868b-eb11ca5aa516 - - - - - -] Device 
tap74cd8662-c0 cannot be used as it has no MAC address#033[00m Nov 26 04:59:02 localhost neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a[308926]: [NOTICE] (308932) : New worker (308936) forked Nov 26 04:59:02 localhost neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a[308926]: [NOTICE] (308932) : Loading success. Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.782 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost kernel: device tap74cd8662-c0 entered promiscuous mode Nov 26 04:59:02 localhost NetworkManager[5970]: [1764151142.7903] manager: (tap74cd8662-c0): new Generic device (/org/freedesktop/NetworkManager/Devices/27) Nov 26 04:59:02 localhost ovn_controller[153664]: 2025-11-26T09:59:02Z|00124|binding|INFO|Claiming lport 74cd8662-c0ce-4a95-bc7b-870335b0d225 for this chassis. Nov 26 04:59:02 localhost ovn_controller[153664]: 2025-11-26T09:59:02Z|00125|binding|INFO|74cd8662-c0ce-4a95-bc7b-870335b0d225: Claiming unknown Nov 26 04:59:02 localhost systemd-udevd[308791]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.798 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.803 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4cfea0db-f255-4656-8a2a-dcab9f6b67f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4cfea0db-f255-4656-8a2a-dcab9f6b67f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dce21479c7b5485f8cfbe2fd1f9c94a9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=00173220-15dd-4026-85e2-e6861014e118, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=74cd8662-c0ce-4a95-bc7b-870335b0d225) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.808 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 74cd8662-c0ce-4a95-bc7b-870335b0d225 in datapath 4cfea0db-f255-4656-8a2a-dcab9f6b67f3 bound to our chassis#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.810 159486 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4cfea0db-f255-4656-8a2a-dcab9f6b67f3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 04:59:02 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:02.811 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4a35a7-5bcd-48d4-8f84-63e199e1812b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:02 localhost journal[229445]: ethtool ioctl error on tap74cd8662-c0: No such device Nov 26 04:59:02 localhost journal[229445]: ethtool ioctl error on tap74cd8662-c0: No such device Nov 26 04:59:02 localhost journal[229445]: ethtool ioctl error on tap74cd8662-c0: No such device Nov 26 04:59:02 localhost ovn_controller[153664]: 2025-11-26T09:59:02Z|00126|binding|INFO|Setting lport 74cd8662-c0ce-4a95-bc7b-870335b0d225 ovn-installed in OVS Nov 26 04:59:02 localhost ovn_controller[153664]: 2025-11-26T09:59:02Z|00127|binding|INFO|Setting lport 74cd8662-c0ce-4a95-bc7b-870335b0d225 up in Southbound Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.839 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost journal[229445]: ethtool ioctl error on tap74cd8662-c0: No such device Nov 26 04:59:02 localhost journal[229445]: ethtool ioctl error on tap74cd8662-c0: No such device Nov 26 04:59:02 localhost journal[229445]: ethtool ioctl error on tap74cd8662-c0: No such device Nov 26 04:59:02 localhost journal[229445]: ethtool ioctl error on tap74cd8662-c0: No such device Nov 26 04:59:02 localhost journal[229445]: ethtool ioctl error on tap74cd8662-c0: No such device Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.867 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.889 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.959 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.959 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance af8a19fc-9bd6-4666-942c-7f001cd8070a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.960 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 04:59:02 localhost nova_compute[281415]: 2025-11-26 09:59:02.960 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1152MB phys_disk=41GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 04:59:03 localhost nova_compute[281415]: 2025-11-26 09:59:03.013 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 04:59:03 localhost nova_compute[281415]: 2025-11-26 09:59:03.079 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 
04:59:03 localhost nova_compute[281415]: 2025-11-26 09:59:03.080 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 04:59:03 localhost nova_compute[281415]: 2025-11-26 09:59:03.093 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 04:59:03 localhost nova_compute[281415]: 2025-11-26 09:59:03.156 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_F16C,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 04:59:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e101 e101: 6 total, 6 up, 6 in Nov 26 04:59:03 localhost nova_compute[281415]: 2025-11-26 09:59:03.277 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:03 
localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:03.665 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:03.666 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:03 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:03.667 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:03 localhost podman[309036]: Nov 26 04:59:03 localhost podman[309036]: 2025-11-26 09:59:03.730380889 +0000 UTC m=+0.088094223 container create c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 26 04:59:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:59:03 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/4221681164' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:59:03 localhost nova_compute[281415]: 2025-11-26 09:59:03.765 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:03 localhost nova_compute[281415]: 2025-11-26 09:59:03.773 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:59:03 localhost systemd[1]: Started libpod-conmon-c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc.scope. Nov 26 04:59:03 localhost podman[309036]: 2025-11-26 09:59:03.68526081 +0000 UTC m=+0.042974184 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:59:03 localhost systemd[1]: Started libcrun container. 
Nov 26 04:59:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05751cbf3becc81fbdd7b8cbf98a89ac59bf8425ee858f226f665d35dfb3d353/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 04:59:03 localhost podman[309036]: 2025-11-26 09:59:03.813200762 +0000 UTC m=+0.170914106 container init c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 04:59:03 localhost podman[309036]: 2025-11-26 09:59:03.824846045 +0000 UTC m=+0.182559389 container start c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 26 04:59:03 localhost dnsmasq[309056]: started, version 2.85 cachesize 150
Nov 26 04:59:03 localhost dnsmasq[309056]: DNS service limited to local subnets
Nov 26 04:59:03 localhost dnsmasq[309056]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 04:59:03 localhost dnsmasq[309056]: warning: no upstream servers configured
Nov 26 04:59:03 localhost dnsmasq-dhcp[309056]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 26 04:59:03 localhost dnsmasq[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/addn_hosts - 0 addresses
Nov 26 04:59:03 localhost dnsmasq-dhcp[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/host
Nov 26 04:59:03 localhost dnsmasq-dhcp[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/opts
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.099 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.125 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.125 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 04:59:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 04:59:04 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:04.144 262471 INFO neutron.agent.dhcp.agent [None req-e8ede7bb-a9ca-4c79-aa2a-8eb0cc107fdd - - - - - -] DHCP configuration for ports {'e89ab5b7-fb42-4389-8e17-b1163b6a09ab'} is completed
Nov 26 04:59:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 04:59:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.
Nov 26 04:59:04 localhost podman[309058]: 2025-11-26 09:59:04.82864814 +0000 UTC m=+0.080377319 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Nov 26 04:59:04 localhost podman[309058]: 2025-11-26 09:59:04.845353217 +0000 UTC m=+0.097082396 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 26 04:59:04 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully.
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.880 281419 DEBUG nova.compute.manager [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Received event network-vif-plugged-e174856e-03c8-44b0-b9d4-dd4da2f98b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.880 281419 DEBUG oslo_concurrency.lockutils [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.881 281419 DEBUG oslo_concurrency.lockutils [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.881 281419 DEBUG oslo_concurrency.lockutils [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.881 281419 DEBUG nova.compute.manager [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Processing event network-vif-plugged-e174856e-03c8-44b0-b9d4-dd4da2f98b9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.882 281419 DEBUG nova.compute.manager [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Received event network-vif-plugged-e174856e-03c8-44b0-b9d4-dd4da2f98b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.882 281419 DEBUG oslo_concurrency.lockutils [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.882 281419 DEBUG oslo_concurrency.lockutils [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.883 281419 DEBUG oslo_concurrency.lockutils [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.883 281419 DEBUG nova.compute.manager [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] No waiting events found dispatching network-vif-plugged-e174856e-03c8-44b0-b9d4-dd4da2f98b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.883 281419 WARNING nova.compute.manager [req-57501f99-1ca5-43c9-91fe-ccc25bf19aa2 req-856ab8c6-4a49-4364-8f06-265953180fa2 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Received unexpected event network-vif-plugged-e174856e-03c8-44b0-b9d4-dd4da2f98b9b for instance with vm_state building and task_state spawning.
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.885 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.887 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.899 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.900 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] VM Resumed (Lifecycle Event)
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.909 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.914 281419 INFO nova.virt.libvirt.driver [-] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Instance spawned successfully.
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.915 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.922 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 26 04:59:04 localhost podman[309057]: 2025-11-26 09:59:04.935950136 +0000 UTC m=+0.189918053 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.938 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.945 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.945 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.946 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.946 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.947 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.948 281419 DEBUG nova.virt.libvirt.driver [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 26 04:59:04 localhost nova_compute[281415]: 2025-11-26 09:59:04.973 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.011 281419 INFO nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Took 8.41 seconds to spawn the instance on the hypervisor.
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.011 281419 DEBUG nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 26 04:59:05 localhost podman[309057]: 2025-11-26 09:59:05.017433758 +0000 UTC m=+0.271401645 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 26 04:59:05 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.096 281419 INFO nova.compute.manager [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Took 9.41 seconds to build instance.
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.114 281419 DEBUG oslo_concurrency.lockutils [None req-991ca720-2561-4655-a74b-916ee597b993 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.132 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.132 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.132 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.133 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 26 04:59:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:05.773 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 26 04:59:05 localhost nova_compute[281415]: 2025-11-26 09:59:05.774 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:59:05 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:05.774 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.138 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:59:06 localhost ovn_controller[153664]: 2025-11-26T09:59:06Z|00128|binding|INFO|Releasing lport d160bab8-f6a4-4dc7-b4dd-c79a804f7583 from this chassis (sb_readonly=0)
Nov 26 04:59:06 localhost ovn_controller[153664]: 2025-11-26T09:59:06Z|00129|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.637 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.649 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.711 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.732 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.733 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.734 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.735 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.745 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 26 04:59:06 localhost ovn_controller[153664]: 2025-11-26T09:59:06Z|00130|binding|INFO|Releasing lport d160bab8-f6a4-4dc7-b4dd-c79a804f7583 from this chassis (sb_readonly=0)
Nov 26 04:59:06 localhost ovn_controller[153664]: 2025-11-26T09:59:06Z|00131|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 04:59:06 localhost nova_compute[281415]: 2025-11-26 09:59:06.970 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 04:59:08 localhost dnsmasq[307856]: exiting on receipt of SIGTERM
Nov 26 04:59:08 localhost podman[309115]: 2025-11-26 09:59:08.05019241 +0000 UTC m=+0.108197323 container kill c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 26 04:59:08 localhost systemd[1]: tmp-crun.b4Bs3R.mount: Deactivated successfully.
Nov 26 04:59:08 localhost systemd[1]: libpod-c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8.scope: Deactivated successfully.
Nov 26 04:59:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.
Nov 26 04:59:08 localhost podman[309129]: 2025-11-26 09:59:08.176546653 +0000 UTC m=+0.091118535 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 26 04:59:08 localhost podman[309129]: 2025-11-26 09:59:08.189490396 +0000 UTC m=+0.104062258 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 26 04:59:08 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully.
Nov 26 04:59:08 localhost nova_compute[281415]: 2025-11-26 09:59:08.225 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:08 localhost podman[309128]: 2025-11-26 09:59:08.25488894 +0000 UTC m=+0.180096755 container died c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:59:08 localhost podman[309128]: 2025-11-26 09:59:08.28028944 +0000 UTC m=+0.205497215 container cleanup c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 04:59:08 localhost systemd[1]: libpod-conmon-c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8.scope: Deactivated successfully. 
Nov 26 04:59:08 localhost podman[309136]: 2025-11-26 09:59:08.32113762 +0000 UTC m=+0.217595723 container remove c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3527677-edc0-4790-a4a3-cebcf166663a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:08.353 262471 INFO neutron.agent.dhcp.agent [None req-fb9dd6da-50de-447e-be65-1a15f18452a0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:08 localhost nova_compute[281415]: 2025-11-26 09:59:08.469 281419 DEBUG nova.compute.manager [req-f090db48-6d03-484d-8871-299d9296f617 req-1529ab88-ad8f-4905-b164-9239f8991598 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Received event network-changed-e174856e-03c8-44b0-b9d4-dd4da2f98b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:59:08 localhost nova_compute[281415]: 2025-11-26 09:59:08.470 281419 DEBUG nova.compute.manager [req-f090db48-6d03-484d-8871-299d9296f617 req-1529ab88-ad8f-4905-b164-9239f8991598 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Refreshing instance network info cache due to event network-changed-e174856e-03c8-44b0-b9d4-dd4da2f98b9b. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Nov 26 04:59:08 localhost nova_compute[281415]: 2025-11-26 09:59:08.471 281419 DEBUG oslo_concurrency.lockutils [req-f090db48-6d03-484d-8871-299d9296f617 req-1529ab88-ad8f-4905-b164-9239f8991598 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquiring lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:59:08 localhost nova_compute[281415]: 2025-11-26 09:59:08.471 281419 DEBUG oslo_concurrency.lockutils [req-f090db48-6d03-484d-8871-299d9296f617 req-1529ab88-ad8f-4905-b164-9239f8991598 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Acquired lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:59:08 localhost nova_compute[281415]: 2025-11-26 09:59:08.472 281419 DEBUG nova.network.neutron [req-f090db48-6d03-484d-8871-299d9296f617 req-1529ab88-ad8f-4905-b164-9239f8991598 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Refreshing network info cache for port e174856e-03c8-44b0-b9d4-dd4da2f98b9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Nov 26 04:59:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:08.694 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:09 localhost systemd[1]: var-lib-containers-storage-overlay-943ce31a4e5f39a6d218114bc386e66ed535d0be6b1895c0a105df51ee53c968-merged.mount: Deactivated successfully. 
Nov 26 04:59:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c598f3b3b43db4298142ff3d887b3116867dabe5e9678803fdea4a67190259e8-userdata-shm.mount: Deactivated successfully. Nov 26 04:59:09 localhost systemd[1]: run-netns-qdhcp\x2dc3527677\x2dedc0\x2d4790\x2da4a3\x2dcebcf166663a.mount: Deactivated successfully. Nov 26 04:59:09 localhost nova_compute[281415]: 2025-11-26 09:59:09.039 281419 DEBUG nova.network.neutron [req-f090db48-6d03-484d-8871-299d9296f617 req-1529ab88-ad8f-4905-b164-9239f8991598 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Updated VIF entry in instance network info cache for port e174856e-03c8-44b0-b9d4-dd4da2f98b9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Nov 26 04:59:09 localhost nova_compute[281415]: 2025-11-26 09:59:09.040 281419 DEBUG nova.network.neutron [req-f090db48-6d03-484d-8871-299d9296f617 req-1529ab88-ad8f-4905-b164-9239f8991598 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Updating instance_info_cache with network_info: [{"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:59:09 localhost nova_compute[281415]: 2025-11-26 09:59:09.066 281419 DEBUG oslo_concurrency.lockutils [req-f090db48-6d03-484d-8871-299d9296f617 req-1529ab88-ad8f-4905-b164-9239f8991598 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] Releasing lock "refresh_cache-af8a19fc-9bd6-4666-942c-7f001cd8070a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:59:09 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:09.118 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:59:08Z, description=, device_id=f9d12139-4231-4b8f-8a45-5fe08bd3118b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4744c63d-720b-4598-8412-86d733d41b44, ip_allocation=immediate, mac_address=fa:16:3e:84:e1:7b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:59:01Z, description=, dns_domain=, id=4cfea0db-f255-4656-8a2a-dcab9f6b67f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-1335898449-network, port_security_enabled=True, project_id=dce21479c7b5485f8cfbe2fd1f9c94a9, provider:network_type=geneve, 
provider:physical_network=None, provider:segmentation_id=50140, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=638, status=ACTIVE, subnets=['37f08380-c514-4dd0-82c2-4dde3a691625'], tags=[], tenant_id=dce21479c7b5485f8cfbe2fd1f9c94a9, updated_at=2025-11-26T09:59:01Z, vlan_transparent=None, network_id=4cfea0db-f255-4656-8a2a-dcab9f6b67f3, port_security_enabled=False, project_id=dce21479c7b5485f8cfbe2fd1f9c94a9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=650, status=DOWN, tags=[], tenant_id=dce21479c7b5485f8cfbe2fd1f9c94a9, updated_at=2025-11-26T09:59:08Z on network 4cfea0db-f255-4656-8a2a-dcab9f6b67f3#033[00m Nov 26 04:59:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:09 localhost systemd[1]: tmp-crun.gBq9TK.mount: Deactivated successfully. 
Nov 26 04:59:09 localhost dnsmasq[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/addn_hosts - 1 addresses Nov 26 04:59:09 localhost dnsmasq-dhcp[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/host Nov 26 04:59:09 localhost dnsmasq-dhcp[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/opts Nov 26 04:59:09 localhost podman[309197]: 2025-11-26 09:59:09.368108385 +0000 UTC m=+0.088360262 container kill c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:09 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:09.632 262471 INFO neutron.agent.dhcp.agent [None req-1aa1072a-eb20-42e3-abd9-3d210e8ff0b4 - - - - - -] DHCP configuration for ports {'4744c63d-720b-4598-8412-86d733d41b44'} is completed#033[00m Nov 26 04:59:09 localhost neutron_sriov_agent[255515]: 2025-11-26 09:59:09.681 2 INFO neutron.agent.securitygroups_rpc [None req-563da1b8-1beb-4802-a143-320365168679 9b97f8a50e9b4d2a829742e2c89653c3 7a98d39e7b5a4b068f04c2241b19fa64 - - default default] Security group member updated ['6cc24193-65ce-4dcb-aaea-4042f3aaa358']#033[00m Nov 26 04:59:09 localhost dnsmasq[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/addn_hosts - 0 addresses Nov 26 04:59:09 localhost dnsmasq-dhcp[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/host Nov 26 04:59:09 localhost dnsmasq-dhcp[306964]: read /var/lib/neutron/dhcp/d5a027cf-17c0-4785-85e8-7feed63239ef/opts Nov 26 
04:59:09 localhost podman[309233]: 2025-11-26 09:59:09.933042964 +0000 UTC m=+0.053422792 container kill a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 04:59:10 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:10.023 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:59:08Z, description=, device_id=f9d12139-4231-4b8f-8a45-5fe08bd3118b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4744c63d-720b-4598-8412-86d733d41b44, ip_allocation=immediate, mac_address=fa:16:3e:84:e1:7b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:59:01Z, description=, dns_domain=, id=4cfea0db-f255-4656-8a2a-dcab9f6b67f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-1335898449-network, port_security_enabled=True, project_id=dce21479c7b5485f8cfbe2fd1f9c94a9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50140, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=638, status=ACTIVE, subnets=['37f08380-c514-4dd0-82c2-4dde3a691625'], tags=[], tenant_id=dce21479c7b5485f8cfbe2fd1f9c94a9, 
updated_at=2025-11-26T09:59:01Z, vlan_transparent=None, network_id=4cfea0db-f255-4656-8a2a-dcab9f6b67f3, port_security_enabled=False, project_id=dce21479c7b5485f8cfbe2fd1f9c94a9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=650, status=DOWN, tags=[], tenant_id=dce21479c7b5485f8cfbe2fd1f9c94a9, updated_at=2025-11-26T09:59:08Z on network 4cfea0db-f255-4656-8a2a-dcab9f6b67f3#033[00m Nov 26 04:59:10 localhost dnsmasq[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/addn_hosts - 1 addresses Nov 26 04:59:10 localhost dnsmasq-dhcp[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/host Nov 26 04:59:10 localhost podman[309272]: 2025-11-26 09:59:10.238094719 +0000 UTC m=+0.061759555 container kill c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 04:59:10 localhost dnsmasq-dhcp[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/opts Nov 26 04:59:10 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:10.480 262471 INFO neutron.agent.dhcp.agent [None req-93e2825b-36e3-426d-a4e0-2b82649d6a3f - - - - - -] DHCP configuration for ports {'4744c63d-720b-4598-8412-86d733d41b44'} is completed#033[00m Nov 26 04:59:10 localhost dnsmasq[306964]: exiting on receipt of SIGTERM Nov 26 04:59:10 localhost podman[309309]: 2025-11-26 09:59:10.597152982 +0000 UTC m=+0.059532876 container kill 
a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 26 04:59:10 localhost systemd[1]: libpod-a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836.scope: Deactivated successfully. Nov 26 04:59:10 localhost ovn_controller[153664]: 2025-11-26T09:59:10Z|00132|binding|INFO|Releasing lport d160bab8-f6a4-4dc7-b4dd-c79a804f7583 from this chassis (sb_readonly=0) Nov 26 04:59:10 localhost ovn_controller[153664]: 2025-11-26T09:59:10Z|00133|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:10 localhost nova_compute[281415]: 2025-11-26 09:59:10.646 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:10 localhost podman[309323]: 2025-11-26 09:59:10.687302858 +0000 UTC m=+0.076558984 container died a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 04:59:10 localhost systemd[1]: 
var-lib-containers-storage-overlay\x2dcontainers-a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836-userdata-shm.mount: Deactivated successfully. Nov 26 04:59:10 localhost podman[309323]: 2025-11-26 09:59:10.725750564 +0000 UTC m=+0.115006650 container cleanup a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 26 04:59:10 localhost systemd[1]: libpod-conmon-a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836.scope: Deactivated successfully. Nov 26 04:59:10 localhost podman[309330]: 2025-11-26 09:59:10.767692927 +0000 UTC m=+0.126069836 container remove a12a0f7702a4c2f73c9ec5868cadbf9ac9ec3b3e7c3e51c0491627b8ad6d0836 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d5a027cf-17c0-4785-85e8-7feed63239ef, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 04:59:10 localhost kernel: device tapbf9d53a1-4d left promiscuous mode Nov 26 04:59:10 localhost nova_compute[281415]: 2025-11-26 09:59:10.781 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:10 localhost ovn_controller[153664]: 
2025-11-26T09:59:10Z|00134|binding|INFO|Releasing lport bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5 from this chassis (sb_readonly=0) Nov 26 04:59:10 localhost ovn_controller[153664]: 2025-11-26T09:59:10Z|00135|binding|INFO|Setting lport bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5 down in Southbound Nov 26 04:59:10 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:10.794 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d5a027cf-17c0-4785-85e8-7feed63239ef', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5a027cf-17c0-4785-85e8-7feed63239ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7a98d39e7b5a4b068f04c2241b19fa64', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dfd8184-bbd7-44be-9a4c-4bf8ed9fa939, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:59:10 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:10.795 159486 INFO neutron.agent.ovn.metadata.agent [-] Port bf9d53a1-4d43-45e5-b44e-d0b988d2f2c5 in datapath d5a027cf-17c0-4785-85e8-7feed63239ef 
unbound from our chassis#033[00m Nov 26 04:59:10 localhost nova_compute[281415]: 2025-11-26 09:59:10.802 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:10 localhost nova_compute[281415]: 2025-11-26 09:59:10.805 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:10 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:10.803 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5a027cf-17c0-4785-85e8-7feed63239ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:59:10 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:10.804 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab6e452-c135-4d3c-a03d-a926e7a257d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:11.024 262471 INFO neutron.agent.dhcp.agent [None req-398ce72d-599e-45c8-924f-5854df204a61 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:11.025 262471 INFO neutron.agent.dhcp.agent [None req-398ce72d-599e-45c8-924f-5854df204a61 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:11 localhost nova_compute[281415]: 2025-11-26 09:59:11.164 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:11 localhost systemd[1]: var-lib-containers-storage-overlay-f1adba597191654912891ec7f65deaf87e9bd7b4c252cd031144588c11dad901-merged.mount: Deactivated successfully. 
Nov 26 04:59:11 localhost systemd[1]: run-netns-qdhcp\x2dd5a027cf\x2d17c0\x2d4785\x2d85e8\x2d7feed63239ef.mount: Deactivated successfully. Nov 26 04:59:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:11.457 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:11 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e102 e102: 6 total, 6 up, 6 in Nov 26 04:59:11 localhost nova_compute[281415]: 2025-11-26 09:59:11.653 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:11 localhost ovn_controller[153664]: 2025-11-26T09:59:11Z|00136|binding|INFO|Releasing lport d160bab8-f6a4-4dc7-b4dd-c79a804f7583 from this chassis (sb_readonly=0) Nov 26 04:59:11 localhost ovn_controller[153664]: 2025-11-26T09:59:11Z|00137|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:11 localhost nova_compute[281415]: 2025-11-26 09:59:11.740 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:11 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:11.776 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:11 localhost neutron_sriov_agent[255515]: 2025-11-26 09:59:11.958 2 INFO neutron.agent.securitygroups_rpc [None req-26db6379-2ee8-4e5a-bdf3-eb4b17371476 9b97f8a50e9b4d2a829742e2c89653c3 7a98d39e7b5a4b068f04c2241b19fa64 - - default default] Security group member updated ['6cc24193-65ce-4dcb-aaea-4042f3aaa358']#033[00m Nov 26 04:59:13 
localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e103 e103: 6 total, 6 up, 6 in Nov 26 04:59:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:59:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:59:13 localhost podman[309355]: 2025-11-26 09:59:13.845084743 +0000 UTC m=+0.097778858 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 04:59:13 localhost podman[309355]: 2025-11-26 09:59:13.865500522 +0000 UTC m=+0.118194687 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:13 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:59:13 localhost podman[309354]: 2025-11-26 09:59:13.935409363 +0000 UTC m=+0.191002436 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 26 04:59:13 localhost podman[309354]: 2025-11-26 09:59:13.944361734 +0000 UTC m=+0.199954847 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.schema-version=1.0) Nov 26 04:59:13 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:59:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:14 localhost ovn_controller[153664]: 2025-11-26T09:59:14Z|00138|binding|INFO|Releasing lport d160bab8-f6a4-4dc7-b4dd-c79a804f7583 from this chassis (sb_readonly=0) Nov 26 04:59:14 localhost ovn_controller[153664]: 2025-11-26T09:59:14Z|00139|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:14 localhost nova_compute[281415]: 2025-11-26 09:59:14.509 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e104 e104: 6 total, 6 up, 6 in Nov 26 04:59:15 localhost openstack_network_exporter[242153]: ERROR 09:59:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:59:15 localhost openstack_network_exporter[242153]: ERROR 09:59:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:59:15 localhost openstack_network_exporter[242153]: ERROR 09:59:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:59:15 localhost openstack_network_exporter[242153]: ERROR 09:59:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:59:15 localhost openstack_network_exporter[242153]: Nov 26 04:59:15 localhost openstack_network_exporter[242153]: ERROR 09:59:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:59:15 localhost openstack_network_exporter[242153]: 
Nov 26 04:59:16 localhost nova_compute[281415]: 2025-11-26 09:59:16.167 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:16 localhost dnsmasq[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/addn_hosts - 0 addresses Nov 26 04:59:16 localhost podman[309410]: 2025-11-26 09:59:16.572984266 +0000 UTC m=+0.070868832 container kill c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:59:16 localhost dnsmasq-dhcp[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/host Nov 26 04:59:16 localhost dnsmasq-dhcp[309056]: read /var/lib/neutron/dhcp/4cfea0db-f255-4656-8a2a-dcab9f6b67f3/opts Nov 26 04:59:16 localhost nova_compute[281415]: 2025-11-26 09:59:16.655 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:16 localhost kernel: device tap74cd8662-c0 left promiscuous mode Nov 26 04:59:16 localhost ovn_controller[153664]: 2025-11-26T09:59:16Z|00140|binding|INFO|Releasing lport 74cd8662-c0ce-4a95-bc7b-870335b0d225 from this chassis (sb_readonly=0) Nov 26 04:59:16 localhost ovn_controller[153664]: 2025-11-26T09:59:16Z|00141|binding|INFO|Setting lport 74cd8662-c0ce-4a95-bc7b-870335b0d225 down in Southbound Nov 26 04:59:16 localhost nova_compute[281415]: 2025-11-26 09:59:16.798 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:16 localhost nova_compute[281415]: 2025-11-26 09:59:16.824 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:16 localhost nova_compute[281415]: 2025-11-26 09:59:16.826 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:16 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:16.836 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4cfea0db-f255-4656-8a2a-dcab9f6b67f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4cfea0db-f255-4656-8a2a-dcab9f6b67f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dce21479c7b5485f8cfbe2fd1f9c94a9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=00173220-15dd-4026-85e2-e6861014e118, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=74cd8662-c0ce-4a95-bc7b-870335b0d225) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 
04:59:16 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:16.838 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 74cd8662-c0ce-4a95-bc7b-870335b0d225 in datapath 4cfea0db-f255-4656-8a2a-dcab9f6b67f3 unbound from our chassis#033[00m Nov 26 04:59:16 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:16.841 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4cfea0db-f255-4656-8a2a-dcab9f6b67f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:59:16 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:16.842 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[04ad4578-e358-4eed-a513-a84baf48cfa3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:18 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e105 e105: 6 total, 6 up, 6 in Nov 26 04:59:18 localhost ovn_controller[153664]: 2025-11-26T09:59:18Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:8a:86 10.100.0.10 Nov 26 04:59:18 localhost ovn_controller[153664]: 2025-11-26T09:59:18Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:8a:86 10.100.0.10 Nov 26 04:59:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:19 localhost nova_compute[281415]: 2025-11-26 09:59:19.802 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Acquiring lock "1b06d58b-c3ef-48da-b878-9d223cafba01" by "nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:19 localhost nova_compute[281415]: 2025-11-26 09:59:19.802 281419 DEBUG oslo_concurrency.lockutils 
[None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lock "1b06d58b-c3ef-48da-b878-9d223cafba01" acquired by "nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:19 localhost nova_compute[281415]: 2025-11-26 09:59:19.803 281419 INFO nova.compute.manager [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Unshelving#033[00m Nov 26 04:59:19 localhost nova_compute[281415]: 2025-11-26 09:59:19.880 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:19 localhost nova_compute[281415]: 2025-11-26 09:59:19.881 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:19 localhost nova_compute[281415]: 2025-11-26 09:59:19.884 281419 DEBUG nova.objects.instance [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lazy-loading 'pci_requests' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:19 localhost 
nova_compute[281415]: 2025-11-26 09:59:19.898 281419 DEBUG nova.objects.instance [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lazy-loading 'numa_topology' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:19 localhost nova_compute[281415]: 2025-11-26 09:59:19.912 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Nov 26 04:59:19 localhost nova_compute[281415]: 2025-11-26 09:59:19.912 281419 INFO nova.compute.claims [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Claim successful on node np0005536118.localdomain#033[00m Nov 26 04:59:19 localhost ovn_controller[153664]: 2025-11-26T09:59:19Z|00142|binding|INFO|Releasing lport d160bab8-f6a4-4dc7-b4dd-c79a804f7583 from this chassis (sb_readonly=0) Nov 26 04:59:19 localhost ovn_controller[153664]: 2025-11-26T09:59:19Z|00143|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.037 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.081 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:20 localhost dnsmasq[309056]: exiting on receipt of SIGTERM Nov 26 04:59:20 localhost podman[309469]: 2025-11-26 09:59:20.431251192 +0000 UTC m=+0.067637653 container kill c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 04:59:20 localhost systemd[1]: libpod-c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc.scope: Deactivated successfully. Nov 26 04:59:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:59:20 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3441691958' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:59:20 localhost podman[309483]: 2025-11-26 09:59:20.502566705 +0000 UTC m=+0.055658900 container died c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 04:59:20 localhost ovn_controller[153664]: 2025-11-26T09:59:20Z|00144|binding|INFO|Releasing lport d160bab8-f6a4-4dc7-b4dd-c79a804f7583 from this chassis (sb_readonly=0) Nov 26 04:59:20 localhost ovn_controller[153664]: 2025-11-26T09:59:20Z|00145|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.519 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:20 localhost systemd[1]: tmp-crun.DSdBkg.mount: Deactivated successfully. Nov 26 04:59:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc-userdata-shm.mount: Deactivated successfully. 
Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.539 281419 DEBUG nova.compute.provider_tree [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.547 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:20 localhost podman[309483]: 2025-11-26 09:59:20.554281405 +0000 UTC m=+0.107373560 container cleanup c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:20 localhost systemd[1]: libpod-conmon-c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc.scope: Deactivated successfully. 
Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.564 281419 DEBUG nova.scheduler.client.report [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:59:20 localhost podman[309485]: 2025-11-26 09:59:20.588966027 +0000 UTC m=+0.133353328 container remove c4ed018115b15d58b340ff07a56fe52eef12c689ff86ecb951d82b7926005ffc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4cfea0db-f255-4656-8a2a-dcab9f6b67f3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.590 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.629 281419 DEBUG 
oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Acquiring lock "refresh_cache-1b06d58b-c3ef-48da-b878-9d223cafba01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.630 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Acquired lock "refresh_cache-1b06d58b-c3ef-48da-b878-9d223cafba01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.630 281419 DEBUG nova.network.neutron [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 26 04:59:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:20.633 262471 INFO neutron.agent.dhcp.agent [None req-b3f9c6cd-44ec-494d-9676-8e11d68ca059 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:20.633 262471 INFO neutron.agent.dhcp.agent [None req-b3f9c6cd-44ec-494d-9676-8e11d68ca059 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.685 281419 DEBUG nova.network.neutron [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.839 281419 DEBUG nova.network.neutron [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.878 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Releasing lock "refresh_cache-1b06d58b-c3ef-48da-b878-9d223cafba01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.880 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.881 281419 INFO nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Creating image(s)#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.916 281419 DEBUG nova.storage.rbd_utils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] rbd image 1b06d58b-c3ef-48da-b878-9d223cafba01_disk does not exist __init__ 
/usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.923 281419 DEBUG nova.objects.instance [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:20 localhost nova_compute[281415]: 2025-11-26 09:59:20.983 281419 DEBUG nova.storage.rbd_utils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] rbd image 1b06d58b-c3ef-48da-b878-9d223cafba01_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.023 281419 DEBUG nova.storage.rbd_utils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] rbd image 1b06d58b-c3ef-48da-b878-9d223cafba01_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.029 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Acquiring lock "6d19398d02af3b688e5d589f78612d5c24c02a71" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.030 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lock "6d19398d02af3b688e5d589f78612d5c24c02a71" 
acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.080 281419 DEBUG nova.virt.libvirt.imagebackend [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Image locations are: [{'url': 'rbd://0d5e5e6d-3c4b-5efe-8c65-346ae6715606/images/ddeb8cdb-c94e-4066-90e1-39eba13b3c06/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://0d5e5e6d-3c4b-5efe-8c65-346ae6715606/images/ddeb8cdb-c94e-4066-90e1-39eba13b3c06/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.247 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.267 281419 DEBUG nova.virt.libvirt.imagebackend [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Selected location: {'url': 'rbd://0d5e5e6d-3c4b-5efe-8c65-346ae6715606/images/ddeb8cdb-c94e-4066-90e1-39eba13b3c06/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.268 281419 DEBUG nova.storage.rbd_utils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] cloning images/ddeb8cdb-c94e-4066-90e1-39eba13b3c06@snap to None/1b06d58b-c3ef-48da-b878-9d223cafba01_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 
09:59:21.336 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:21 localhost systemd[1]: var-lib-containers-storage-overlay-05751cbf3becc81fbdd7b8cbf98a89ac59bf8425ee858f226f665d35dfb3d353-merged.mount: Deactivated successfully. Nov 26 04:59:21 localhost systemd[1]: run-netns-qdhcp\x2d4cfea0db\x2df255\x2d4656\x2d8a2a\x2ddcab9f6b67f3.mount: Deactivated successfully. Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.461 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lock "6d19398d02af3b688e5d589f78612d5c24c02a71" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.661 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.675 281419 DEBUG nova.objects.instance [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lazy-loading 'migration_context' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:21 localhost nova_compute[281415]: 2025-11-26 09:59:21.813 281419 DEBUG nova.storage.rbd_utils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] flattening vms/1b06d58b-c3ef-48da-b878-9d223cafba01_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 
09:59:22.532 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Image rbd:vms/1b06d58b-c3ef-48da-b878-9d223cafba01_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.533 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.533 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Ensure instance console log exists: /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.534 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.534 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d 
fecaabee96db4df99aab87c833af138c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.535 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.537 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-11-26T09:58:57Z,direct_url=,disk_format='raw',id=ddeb8cdb-c94e-4066-90e1-39eba13b3c06,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-841905651-shelved',owner='f1d07cdb6b514a4ca4915ea1d47f9b18',properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=2025-11-26T09:59:15Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'size': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'device_type': 'disk', 'disk_bus': 'virtio', 
'boot_index': 0, 'image_id': '211ae400-609a-4c22-9588-f4189139a50b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.542 281419 WARNING nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.547 281419 DEBUG nova.virt.libvirt.host [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Searching host: 'np0005536118.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.548 281419 DEBUG nova.virt.libvirt.host [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.549 281419 DEBUG nova.virt.libvirt.host [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Searching host: 'np0005536118.localdomain' for CPU controller through CGroups V2... 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.550 281419 DEBUG nova.virt.libvirt.host [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.550 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.551 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-26T09:57:07Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3b6f30a1-d6bf-48f0-b946-f1964a0a6750',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-11-26T09:58:57Z,direct_url=,disk_format='raw',id=ddeb8cdb-c94e-4066-90e1-39eba13b3c06,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-841905651-shelved',owner='f1d07cdb6b514a4ca4915ea1d47f9b18',properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=2025-11-26T09:59:15Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.551 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.552 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.552 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.552 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.553 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.553 281419 DEBUG 
nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.553 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.554 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.554 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.555 281419 DEBUG nova.virt.hardware [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.555 281419 
DEBUG nova.objects.instance [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:22 localhost nova_compute[281415]: 2025-11-26 09:59:22.575 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 04:59:23 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3357733697' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.045 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.087 281419 DEBUG nova.storage.rbd_utils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] rbd image 1b06d58b-c3ef-48da-b878-9d223cafba01_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 
09:59:23.093 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 04:59:23 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/672746538' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.517 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.521 281419 DEBUG nova.objects.instance [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.542 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] End _get_guest_xml xml= Nov 26 04:59:23 localhost nova_compute[281415]: 1b06d58b-c3ef-48da-b878-9d223cafba01 Nov 26 04:59:23 localhost 
Nov 26 04:59:23 localhost nova_compute[281415]: [libvirt guest XML elided: the domain definition logged by _get_guest_xml lost its XML markup during log capture; recoverable fields: domain name instance-00000006, uuid 1b06d58b-c3ef-48da-b878-9d223cafba01, memory 131072 KiB, 1 vCPU, instance display name tempest-UnshelveToHostMultiNodesTest-server-841905651, created 2025-11-26 09:59:22, flavor 128 MB / 1 vCPU / 1 GB root disk, owner user tempest-UnshelveToHostMultiNodesTest-873446535-project-member, project tempest-UnshelveToHostMultiNodesTest-873446535, sysinfo vendor RDO, product OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9, machine type hvm, rng backend /dev/urandom]
nova_compute[281415]: Nov 26 04:59:23 localhost nova_compute[281415]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.596 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.597 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.598 281419 INFO nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Using config drive#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.643 281419 DEBUG nova.storage.rbd_utils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] rbd image 1b06d58b-c3ef-48da-b878-9d223cafba01_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.665 281419 DEBUG nova.objects.instance [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 
obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.704 281419 DEBUG nova.objects.instance [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lazy-loading 'keypairs' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.835 281419 INFO nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Creating config drive at /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01/disk.config#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.843 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd7kbr6s1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:23 localhost nova_compute[281415]: 2025-11-26 09:59:23.974 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd7kbr6s1" returned: 
0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.019 281419 DEBUG nova.storage.rbd_utils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] rbd image 1b06d58b-c3ef-48da-b878-9d223cafba01_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.025 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01/disk.config 1b06d58b-c3ef-48da-b878-9d223cafba01_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.264 281419 DEBUG oslo_concurrency.processutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01/disk.config 1b06d58b-c3ef-48da-b878-9d223cafba01_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.265 281419 INFO nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d 
fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Deleting local config drive /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01/disk.config because it was imported into RBD.#033[00m Nov 26 04:59:24 localhost systemd-machined[83873]: New machine qemu-5-instance-00000006. Nov 26 04:59:24 localhost systemd[1]: Started Virtual Machine qemu-5-instance-00000006. Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:24.505 159587 DEBUG eventlet.wsgi.server [-] (159587) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:24.507 159587 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015 Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: Accept: */*#015 Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: Connection: close#015 Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: Content-Type: text/plain#015 Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: Host: 169.254.169.254#015 Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: User-Agent: curl/7.84.0#015 Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: X-Forwarded-For: 10.100.0.10#015 Nov 26 04:59:24 localhost ovn_metadata_agent[159481]: X-Ovn-Network-Id: 6a06c83c-a173-43fd-9343-735e8a52503a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.715 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.748 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:59:24 localhost 
nova_compute[281415]: 2025-11-26 09:59:24.748 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] VM Resumed (Lifecycle Event)#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.754 281419 DEBUG nova.compute.manager [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.755 281419 DEBUG nova.virt.libvirt.driver [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.761 281419 INFO nova.virt.libvirt.driver [-] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Instance spawned successfully.#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.785 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.789 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.808 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.809 281419 DEBUG nova.virt.driver [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.809 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] VM Started (Lifecycle Event)#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.827 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.832 281419 DEBUG nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 26 04:59:24 localhost nova_compute[281415]: 2025-11-26 09:59:24.864 281419 INFO nova.compute.manager [None req-af0d0075-eb2e-4df6-85df-8e4e53901174 - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] During sync_power_state the instance has a pending task (spawning). 
Skip.#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.181 159587 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 26 04:59:25 localhost haproxy-metadata-proxy-6a06c83c-a173-43fd-9343-735e8a52503a[308936]: 10.100.0.10:55262 [26/Nov/2025:09:59:24.504] listener listener/metadata 0/0/0/678/678 200 1657 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1" Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.182 159587 INFO eventlet.wsgi.server [-] 10.100.0.10, "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200 len: 1673 time: 0.6748958#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.298 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "af8a19fc-9bd6-4666-942c-7f001cd8070a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.299 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.299 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.300 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.300 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.303 281419 INFO nova.compute.manager [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Terminating instance#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.305 281419 DEBUG nova.compute.manager [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Start destroying the instance on the hypervisor. 
_shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m Nov 26 04:59:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e106 e106: 6 total, 6 up, 6 in Nov 26 04:59:25 localhost kernel: device tape174856e-03 left promiscuous mode Nov 26 04:59:25 localhost NetworkManager[5970]: [1764151165.3906] device (tape174856e-03): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.398 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:25 localhost ovn_controller[153664]: 2025-11-26T09:59:25Z|00146|binding|INFO|Releasing lport e174856e-03c8-44b0-b9d4-dd4da2f98b9b from this chassis (sb_readonly=0) Nov 26 04:59:25 localhost ovn_controller[153664]: 2025-11-26T09:59:25Z|00147|binding|INFO|Setting lport e174856e-03c8-44b0-b9d4-dd4da2f98b9b down in Southbound Nov 26 04:59:25 localhost ovn_controller[153664]: 2025-11-26T09:59:25Z|00148|binding|INFO|Removing iface tape174856e-03 ovn-installed in OVS Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.410 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.410 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:8a:86 10.100.0.10'], port_security=['fa:16:3e:d6:8a:86 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 
'af8a19fc-9bd6-4666-942c-7f001cd8070a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a06c83c-a173-43fd-9343-735e8a52503a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f251213a9644261874d24d123ed8f23', 'neutron:revision_number': '4', 'neutron:security_group_ids': '98096288-cb5f-4c7e-bb1e-1596965807ee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain', 'neutron:port_fip': '192.168.122.220'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07926bbc-17e6-41e5-b392-ed892f429233, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=e174856e-03c8-44b0-b9d4-dd4da2f98b9b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.411 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.413 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e174856e-03c8-44b0-b9d4-dd4da2f98b9b in datapath 6a06c83c-a173-43fd-9343-735e8a52503a unbound from our chassis#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.417 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port b38eeded-6aa3-44f5-a834-8d27f0d849a6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.417 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a06c83c-a173-43fd-9343-735e8a52503a, tearing the namespace down if 
needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.419 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5c10e7-ff58-44b2-baa2-903b291108d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.420 159486 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a namespace which is not needed anymore#033[00m Nov 26 04:59:25 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully. Nov 26 04:59:25 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 14.697s CPU time. Nov 26 04:59:25 localhost systemd-machined[83873]: Machine qemu-4-instance-00000009 terminated. Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.550 281419 INFO nova.virt.libvirt.driver [-] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Instance destroyed successfully.#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.551 281419 DEBUG nova.objects.instance [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lazy-loading 'resources' on Instance uuid af8a19fc-9bd6-4666-942c-7f001cd8070a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.565 281419 DEBUG nova.virt.libvirt.vif [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-26T09:58:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005536118.localdomain',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=9,image_ref='211ae400-609a-4c22-9588-f4189139a50b',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAljQH8gCNgGhATjkietZYVOGUv9VI425iWTya/TD39tqiN7SIxn9uznipoLdXT8R/xFtDcXKwPW29szUZwpP3LzmUxMGKMSAF2UU0eEvXxvMbIuwBuMXg4aL08pjqc1bw==',key_name='tempest-keypair-1335953635',keypairs=,launch_index=0,launched_at=2025-11-26T09:59:05Z,launched_on='np0005536118.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005536118.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='3f251213a9644261874d24d123ed8f23',ramdisk_id='',reservation_id='r-esc8e7o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='211ae400-609a-4c22-9588-f4189139a50b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-674665483',owner_user_name='
tempest-ServersV294TestFqdnHostnames-674665483-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2025-11-26T09:59:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77a0b73f7f574b48a9c231e26511534a',uuid=af8a19fc-9bd6-4666-942c-7f001cd8070a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": "fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.565 281419 DEBUG nova.network.os_vif_util [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Converting VIF {"id": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "address": 
"fa:16:3e:d6:8a:86", "network": {"id": "6a06c83c-a173-43fd-9343-735e8a52503a", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-789735874-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "3f251213a9644261874d24d123ed8f23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape174856e-03", "ovs_interfaceid": "e174856e-03c8-44b0-b9d4-dd4da2f98b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.567 281419 DEBUG nova.network.os_vif_util [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:8a:86,bridge_name='br-int',has_traffic_filtering=True,id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b,network=Network(6a06c83c-a173-43fd-9343-735e8a52503a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape174856e-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.568 281419 DEBUG os_vif [None req-7c895742-655f-4516-be4f-337a08dad4ff 
77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:8a:86,bridge_name='br-int',has_traffic_filtering=True,id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b,network=Network(6a06c83c-a173-43fd-9343-735e8a52503a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape174856e-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.569 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.570 281419 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape174856e-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.573 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.574 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.578 281419 INFO os_vif [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:8a:86,bridge_name='br-int',has_traffic_filtering=True,id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b,network=Network(6a06c83c-a173-43fd-9343-735e8a52503a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape174856e-03')#033[00m 
Nov 26 04:59:25 localhost neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a[308926]: [NOTICE] (308932) : haproxy version is 2.8.14-c23fe91 Nov 26 04:59:25 localhost neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a[308926]: [NOTICE] (308932) : path to executable is /usr/sbin/haproxy Nov 26 04:59:25 localhost neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a[308926]: [WARNING] (308932) : Exiting Master process... Nov 26 04:59:25 localhost neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a[308926]: [WARNING] (308932) : Exiting Master process... Nov 26 04:59:25 localhost neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a[308926]: [ALERT] (308932) : Current worker (308936) exited with code 143 (Terminated) Nov 26 04:59:25 localhost neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a[308926]: [WARNING] (308932) : All workers exited. Exiting... (0) Nov 26 04:59:25 localhost systemd[1]: libpod-0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec.scope: Deactivated successfully. Nov 26 04:59:25 localhost podman[309933]: 2025-11-26 09:59:25.665833524 +0000 UTC m=+0.085588497 container died 0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 04:59:25 localhost systemd[1]: tmp-crun.vmi3L6.mount: Deactivated successfully. 
Nov 26 04:59:25 localhost podman[309933]: 2025-11-26 09:59:25.722529015 +0000 UTC m=+0.142283958 container cleanup 0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:59:25 localhost podman[309964]: 2025-11-26 09:59:25.774282995 +0000 UTC m=+0.094642823 container cleanup 0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 04:59:25 localhost systemd[1]: libpod-conmon-0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec.scope: Deactivated successfully. 
Nov 26 04:59:25 localhost podman[309978]: 2025-11-26 09:59:25.849063984 +0000 UTC m=+0.091494217 container remove 0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.860 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0ec2c2-f8e1-4e1c-a751-343f2a9995b0]: (4, ('Wed Nov 26 09:59:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a (0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec)\n0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec\nWed Nov 26 09:59:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a (0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec)\n0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.863 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f538a59e-e95a-46da-b440-c7d9b7a24806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.864 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a06c83c-a0, bridge=None, if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.908 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:25 localhost kernel: device tap6a06c83c-a0 left promiscuous mode Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.925 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.928 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[747746cd-d2b7-4667-99ea-4b296b139445]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.947 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[278e2227-527e-4978-b199-f60d124753a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.947 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[02af0b3b-2783-4adb-8717-c00adb91ad44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.964 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[5a393c28-1d21-4886-be6f-80d1c22174e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], 
['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 
'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1161702, 'reachable_time': 17189, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309996, 'error': None, 'target': 'ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 
'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.967 159623 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a06c83c-a173-43fd-9343-735e8a52503a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 26 04:59:25 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:25.967 159623 DEBUG oslo.privsep.daemon [-] privsep: reply[e83099a3-46d3-4d6e-bea5-2f339feb4f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:25 localhost nova_compute[281415]: 2025-11-26 09:59:25.983 281419 DEBUG nova.compute.manager [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.056 281419 DEBUG oslo_concurrency.lockutils [None req-9a571ee3-33d1-4d8e-b862-80185597e3c2 8a5a39e4767440b19885f9b9d4b73c2d fecaabee96db4df99aab87c833af138c - - default default] Lock "1b06d58b-c3ef-48da-b878-9d223cafba01" "released" by "nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" :: held 6.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.370 281419 INFO nova.virt.libvirt.driver [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Deleting instance files /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a_del#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.373 281419 INFO 
nova.virt.libvirt.driver [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Deletion of /var/lib/nova/instances/af8a19fc-9bd6-4666-942c-7f001cd8070a_del complete#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.516 281419 INFO nova.compute.manager [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Took 1.21 seconds to destroy the instance on the hypervisor.#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.517 281419 DEBUG oslo.service.loopingcall [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.517 281419 DEBUG nova.compute.manager [-] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.518 281419 DEBUG nova.network.neutron [-] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.615 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Acquiring lock "1b06d58b-c3ef-48da-b878-9d223cafba01" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.616 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Lock "1b06d58b-c3ef-48da-b878-9d223cafba01" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.616 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Acquiring lock "1b06d58b-c3ef-48da-b878-9d223cafba01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.617 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Lock "1b06d58b-c3ef-48da-b878-9d223cafba01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.617 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Lock "1b06d58b-c3ef-48da-b878-9d223cafba01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.619 281419 INFO nova.compute.manager [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Terminating instance#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.621 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Acquiring lock "refresh_cache-1b06d58b-c3ef-48da-b878-9d223cafba01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.622 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default 
default] Acquired lock "refresh_cache-1b06d58b-c3ef-48da-b878-9d223cafba01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.622 281419 DEBUG nova.network.neutron [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 26 04:59:26 localhost systemd[1]: var-lib-containers-storage-overlay-8ab16417e4894483337d1fa0f10cfbe2c30cc5cafd7c0c4eeabb73326e60dff7-merged.mount: Deactivated successfully. Nov 26 04:59:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0bba657e2e77a7cc168f4577bc1d0b6c16fc2374729fa4df49ae2d44a5b627ec-userdata-shm.mount: Deactivated successfully. Nov 26 04:59:26 localhost systemd[1]: run-netns-ovnmeta\x2d6a06c83c\x2da173\x2d43fd\x2d9343\x2d735e8a52503a.mount: Deactivated successfully. Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.663 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.692 281419 DEBUG nova.network.neutron [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.823 281419 DEBUG nova.network.neutron [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.841 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Releasing lock "refresh_cache-1b06d58b-c3ef-48da-b878-9d223cafba01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 04:59:26 localhost nova_compute[281415]: 2025-11-26 09:59:26.842 281419 DEBUG nova.compute.manager [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m Nov 26 04:59:26 localhost systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Deactivated successfully. Nov 26 04:59:26 localhost systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Consumed 2.534s CPU time. Nov 26 04:59:26 localhost systemd-machined[83873]: Machine qemu-5-instance-00000006 terminated. 
Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.074 281419 INFO nova.virt.libvirt.driver [-] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Instance destroyed successfully.#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.075 281419 DEBUG nova.objects.instance [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Lazy-loading 'resources' on Instance uuid 1b06d58b-c3ef-48da-b878-9d223cafba01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 04:59:27 localhost neutron_sriov_agent[255515]: 2025-11-26 09:59:27.132 2 INFO neutron.agent.securitygroups_rpc [req-7c895742-655f-4516-be4f-337a08dad4ff req-7640936c-2818-49b6-b281-4863042f4a1a 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Security group member updated ['98096288-cb5f-4c7e-bb1e-1596965807ee']#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.383 281419 DEBUG nova.network.neutron [-] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.394 281419 DEBUG nova.compute.manager [req-aa264822-9f6a-45c7-9c19-dda6826033a9 req-021f099a-c2ab-47b0-ae77-1ed93aaeefc6 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Received event network-vif-deleted-e174856e-03c8-44b0-b9d4-dd4da2f98b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.395 281419 INFO nova.compute.manager [req-aa264822-9f6a-45c7-9c19-dda6826033a9 req-021f099a-c2ab-47b0-ae77-1ed93aaeefc6 
ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Neutron deleted interface e174856e-03c8-44b0-b9d4-dd4da2f98b9b; detaching it from the instance and deleting it from the info cache#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.395 281419 DEBUG nova.network.neutron [req-aa264822-9f6a-45c7-9c19-dda6826033a9 req-021f099a-c2ab-47b0-ae77-1ed93aaeefc6 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:59:27 localhost podman[240049]: time="2025-11-26T09:59:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.480 281419 INFO nova.compute.manager [-] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Took 0.96 seconds to deallocate network for instance.#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.497 281419 DEBUG nova.compute.manager [req-aa264822-9f6a-45c7-9c19-dda6826033a9 req-021f099a-c2ab-47b0-ae77-1ed93aaeefc6 ee14bbc966da4bb3b9688769e55b4cc4 9aade6d9c4d94af5a0404e802fc179ab - - default default] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Detach interface failed, port_id=e174856e-03c8-44b0-b9d4-dd4da2f98b9b, reason: Instance af8a19fc-9bd6-4666-942c-7f001cd8070a could not be found. 
_process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m Nov 26 04:59:27 localhost podman[240049]: @ - - [26/Nov/2025:09:59:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 26 04:59:27 localhost podman[240049]: @ - - [26/Nov/2025:09:59:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19257 "" "Go-http-client/1.1" Nov 26 04:59:27 localhost dnsmasq[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/addn_hosts - 1 addresses Nov 26 04:59:27 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/host Nov 26 04:59:27 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/opts Nov 26 04:59:27 localhost podman[310036]: 2025-11-26 09:59:27.595407552 +0000 UTC m=+0.170599833 container kill d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.637 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:27 
localhost nova_compute[281415]: 2025-11-26 09:59:27.641 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.744 281419 INFO nova.virt.libvirt.driver [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Deleting instance files /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01_del#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.746 281419 INFO nova.virt.libvirt.driver [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Deletion of /var/lib/nova/instances/1b06d58b-c3ef-48da-b878-9d223cafba01_del complete#033[00m Nov 26 04:59:27 localhost sshd[310054]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.753 281419 DEBUG oslo_concurrency.processutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.803 281419 INFO nova.compute.manager [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] 
Took 0.96 seconds to destroy the instance on the hypervisor.#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.804 281419 DEBUG oslo.service.loopingcall [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.806 281419 DEBUG nova.compute.manager [-] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.806 281419 DEBUG nova.network.neutron [-] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.893 281419 DEBUG nova.network.neutron [-] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.914 281419 DEBUG nova.network.neutron [-] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.925 281419 INFO nova.compute.manager [-] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Took 0.12 seconds to deallocate network for instance.#033[00m Nov 26 04:59:27 localhost nova_compute[281415]: 2025-11-26 09:59:27.976 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 04:59:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:59:28 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3606946874' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.242 281419 DEBUG oslo_concurrency.processutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.251 281419 DEBUG nova.compute.provider_tree [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0. 
Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.254087) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34 Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151168254148, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 710, "num_deletes": 258, "total_data_size": 645689, "memory_usage": 658720, "flush_reason": "Manual Compaction"} Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151168259821, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 421047, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20526, "largest_seqno": 21231, "table_properties": {"data_size": 417760, "index_size": 1142, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8044, "raw_average_key_size": 19, "raw_value_size": 410882, "raw_average_value_size": 987, "num_data_blocks": 50, "num_entries": 416, "num_filter_entries": 416, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151138, "oldest_key_time": 1764151138, "file_creation_time": 1764151168, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 5783 microseconds, and 2294 cpu microseconds. Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.259869) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 421047 bytes OK Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.259895) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.263813) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.263836) EVENT_LOG_v1 {"time_micros": 1764151168263829, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.263855) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 641793, prev total WAL file size 
642117, number of live WAL files 2. Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.264557) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373637' seq:72057594037927935, type:22 .. '6C6F676D0034303139' seq:0, type:0; will stop at (end) Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(411KB)], [33(14MB)] Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151168264620, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 16091400, "oldest_snapshot_seqno": -1} Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.293 281419 DEBUG nova.scheduler.client.report [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.320 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.323 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 11767 keys, 15985575 bytes, temperature: kUnknown Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151168351161, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 15985575, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15919751, "index_size": 35323, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29445, "raw_key_size": 316693, "raw_average_key_size": 26, "raw_value_size": 15720559, "raw_average_value_size": 1335, "num_data_blocks": 1336, "num_entries": 11767, "num_filter_entries": 11767, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", 
"column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151168, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.351575) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 15985575 bytes Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.353457) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.7 rd, 184.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 14.9 +0.0 blob) out(15.2 +0.0 blob), read-write-amplify(76.2) write-amplify(38.0) OK, records in: 12301, records dropped: 534 output_compression: NoCompression Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.353491) EVENT_LOG_v1 {"time_micros": 1764151168353475, "job": 18, "event": "compaction_finished", "compaction_time_micros": 86663, "compaction_time_cpu_micros": 55097, "output_level": 6, "num_output_files": 1, "total_output_size": 15985575, "num_input_records": 12301, "num_output_records": 11767, "num_subcompactions": 
1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151168353730, "job": 18, "event": "table_file_deletion", "file_number": 35} Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151168356556, "job": 18, "event": "table_file_deletion", "file_number": 33} Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.264431) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.356653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.356660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.356663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.356666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:59:28 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-09:59:28.356669) 
[db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.369 281419 INFO nova.scheduler.client.report [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Deleted allocations for instance af8a19fc-9bd6-4666-942c-7f001cd8070a#033[00m Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.420 281419 DEBUG oslo_concurrency.processutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.448 281419 DEBUG oslo_concurrency.lockutils [None req-7c895742-655f-4516-be4f-337a08dad4ff 77a0b73f7f574b48a9c231e26511534a 3f251213a9644261874d24d123ed8f23 - - default default] Lock "af8a19fc-9bd6-4666-942c-7f001cd8070a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:59:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 04:59:28 localhost podman[310101]: 2025-11-26 09:59:28.837214865 +0000 UTC m=+0.091945672 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:59:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 04:59:28 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4205076900' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.894 281419 DEBUG oslo_concurrency.processutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 04:59:28 localhost systemd[1]: tmp-crun.eJQnLW.mount: Deactivated successfully. 
Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.906 281419 DEBUG nova.compute.provider_tree [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 04:59:28 localhost podman[310102]: 2025-11-26 09:59:28.917649297 +0000 UTC m=+0.167766539 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.926 281419 DEBUG nova.scheduler.client.report [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 04:59:28 localhost podman[310101]: 2025-11-26 09:59:28.931695011 +0000 UTC m=+0.186425808 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 04:59:28 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.952 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:28 localhost podman[310102]: 2025-11-26 09:59:28.956542874 +0000 UTC m=+0.206660186 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:59:28 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:59:28 localhost nova_compute[281415]: 2025-11-26 09:59:28.987 281419 INFO nova.scheduler.client.report [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Deleted allocations for instance 1b06d58b-c3ef-48da-b878-9d223cafba01#033[00m Nov 26 04:59:29 localhost nova_compute[281415]: 2025-11-26 09:59:29.087 281419 DEBUG oslo_concurrency.lockutils [None req-17e39276-73bb-4749-afbe-5be9a4ce9706 d01c4f5f041d4046bdada174e89c4388 f1d07cdb6b514a4ca4915ea1d47f9b18 - - default default] Lock "1b06d58b-c3ef-48da-b878-9d223cafba01" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 04:59:29 localhost sshd[310143]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:59:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:29 localhost sshd[310145]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:59:29 localhost sshd[310147]: main: sshd: ssh-rsa 
algorithm is disabled Nov 26 04:59:30 localhost nova_compute[281415]: 2025-11-26 09:59:30.574 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:31 localhost ovn_controller[153664]: 2025-11-26T09:59:31Z|00149|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:31 localhost nova_compute[281415]: 2025-11-26 09:59:31.174 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:31 localhost podman[310164]: 2025-11-26 09:59:31.425772325 +0000 UTC m=+0.062706500 container kill d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:59:31 localhost dnsmasq[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/addn_hosts - 0 addresses Nov 26 04:59:31 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/host Nov 26 04:59:31 localhost dnsmasq-dhcp[307468]: read /var/lib/neutron/dhcp/6a06c83c-a173-43fd-9343-735e8a52503a/opts Nov 26 04:59:31 localhost nova_compute[281415]: 2025-11-26 09:59:31.675 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:31 localhost ovn_controller[153664]: 2025-11-26T09:59:31Z|00150|binding|INFO|Releasing lport 
f84f6614-2902-417c-b63b-598ceb7caab0 from this chassis (sb_readonly=0) Nov 26 04:59:31 localhost ovn_controller[153664]: 2025-11-26T09:59:31Z|00151|binding|INFO|Setting lport f84f6614-2902-417c-b63b-598ceb7caab0 down in Southbound Nov 26 04:59:31 localhost kernel: device tapf84f6614-29 left promiscuous mode Nov 26 04:59:31 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:31.688 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-6a06c83c-a173-43fd-9343-735e8a52503a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a06c83c-a173-43fd-9343-735e8a52503a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f251213a9644261874d24d123ed8f23', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07926bbc-17e6-41e5-b392-ed892f429233, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f84f6614-2902-417c-b63b-598ceb7caab0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:59:31 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:31.690 159486 INFO neutron.agent.ovn.metadata.agent [-] Port f84f6614-2902-417c-b63b-598ceb7caab0 in datapath 
6a06c83c-a173-43fd-9343-735e8a52503a unbound from our chassis#033[00m Nov 26 04:59:31 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:31.692 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a06c83c-a173-43fd-9343-735e8a52503a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:59:31 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:31.694 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ca4bb49c-61e0-4b37-83f4-67f45abf7e07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:31 localhost nova_compute[281415]: 2025-11-26 09:59:31.695 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:33 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e107 e107: 6 total, 6 up, 6 in Nov 26 04:59:33 localhost ovn_controller[153664]: 2025-11-26T09:59:33Z|00152|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:33 localhost sshd[310186]: main: sshd: ssh-rsa algorithm is disabled Nov 26 04:59:33 localhost nova_compute[281415]: 2025-11-26 09:59:33.674 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:34 localhost podman[310206]: 2025-11-26 09:59:34.146869625 +0000 UTC m=+0.064200484 container kill d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 04:59:34 localhost dnsmasq[307468]: exiting on receipt of SIGTERM Nov 26 04:59:34 localhost systemd[1]: libpod-d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6.scope: Deactivated successfully. Nov 26 04:59:34 localhost podman[310222]: 2025-11-26 09:59:34.238138976 +0000 UTC m=+0.067819901 container died d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 04:59:34 localhost podman[310222]: 2025-11-26 09:59:34.273323534 +0000 UTC m=+0.103004419 container cleanup d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 04:59:34 localhost systemd[1]: libpod-conmon-d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6.scope: 
Deactivated successfully. Nov 26 04:59:34 localhost podman[310221]: 2025-11-26 09:59:34.326424441 +0000 UTC m=+0.155727395 container remove d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a06c83c-a173-43fd-9343-735e8a52503a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 04:59:34 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:34.363 262471 INFO neutron.agent.dhcp.agent [None req-0780a239-69b5-4116-8fb2-ce963a5afe37 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:59:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:59:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:59:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:59:34 localhost snmpd[66980]: empty variable list in _query Nov 26 04:59:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 04:59:35 localhost podman[310252]: 2025-11-26 09:59:35.086461715 +0000 UTC m=+0.091111658 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal) Nov 26 04:59:35 localhost podman[310252]: 2025-11-26 09:59:35.103318942 +0000 UTC m=+0.107968885 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, distribution-scope=public, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter) Nov 26 04:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 04:59:35 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 04:59:35 localhost systemd[1]: var-lib-containers-storage-overlay-228f9830e991070f2a480624611f93d44afabdc42ed2aa40abbf57afcbd73a7a-merged.mount: Deactivated successfully. Nov 26 04:59:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d59321185e07c46f5d4a2fb56ea0ba6b861c077b1713e710c604a0a7202b89c6-userdata-shm.mount: Deactivated successfully. Nov 26 04:59:35 localhost systemd[1]: run-netns-qdhcp\x2d6a06c83c\x2da173\x2d43fd\x2d9343\x2d735e8a52503a.mount: Deactivated successfully. Nov 26 04:59:35 localhost podman[310272]: 2025-11-26 09:59:35.230866244 +0000 UTC m=+0.097063004 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 26 
04:59:35 localhost podman[310272]: 2025-11-26 09:59:35.339635922 +0000 UTC m=+0.205832702 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 26 04:59:35 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:35.338 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:35 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 04:59:35 localhost nova_compute[281415]: 2025-11-26 09:59:35.576 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:36 localhost nova_compute[281415]: 2025-11-26 09:59:36.713 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 04:59:38 localhost systemd[1]: tmp-crun.qm6SMH.mount: Deactivated successfully. Nov 26 04:59:38 localhost podman[310298]: 2025-11-26 09:59:38.834550493 +0000 UTC m=+0.093812578 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:59:38 localhost podman[310298]: 2025-11-26 09:59:38.845180716 +0000 UTC m=+0.104442791 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 04:59:38 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 04:59:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:40 localhost nova_compute[281415]: 2025-11-26 09:59:40.547 281419 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:59:40 localhost nova_compute[281415]: 2025-11-26 09:59:40.549 281419 INFO nova.compute.manager [-] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] VM Stopped (Lifecycle Event)#033[00m Nov 26 04:59:40 localhost nova_compute[281415]: 2025-11-26 09:59:40.578 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:40 localhost nova_compute[281415]: 2025-11-26 09:59:40.663 281419 DEBUG nova.compute.manager [None req-3178416e-978f-43eb-8309-30387d29db7e - - - - - -] [instance: af8a19fc-9bd6-4666-942c-7f001cd8070a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:59:41 localhost nova_compute[281415]: 2025-11-26 09:59:41.743 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.072 281419 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.072 281419 INFO nova.compute.manager [-] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] VM Stopped (Lifecycle Event)#033[00m Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.093 281419 DEBUG nova.compute.manager [None req-cc0bc71f-505e-466d-ad6f-bfbbcecf1ded - - - - - -] [instance: 1b06d58b-c3ef-48da-b878-9d223cafba01] Checking state _get_power_state 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 26 04:59:42 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:42.170 262471 INFO neutron.agent.linux.ip_lib [None req-087c1cab-bd40-4c7e-8e3d-29180267829e - - - - - -] Device tap4628ae84-e4 cannot be used as it has no MAC address#033[00m Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.199 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:42 localhost kernel: device tap4628ae84-e4 entered promiscuous mode Nov 26 04:59:42 localhost NetworkManager[5970]: [1764151182.2093] manager: (tap4628ae84-e4): new Generic device (/org/freedesktop/NetworkManager/Devices/28) Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.210 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:42 localhost ovn_controller[153664]: 2025-11-26T09:59:42Z|00153|binding|INFO|Claiming lport 4628ae84-e43c-4c22-8931-7c0550ea7c42 for this chassis. Nov 26 04:59:42 localhost ovn_controller[153664]: 2025-11-26T09:59:42Z|00154|binding|INFO|4628ae84-e43c-4c22-8931-7c0550ea7c42: Claiming unknown Nov 26 04:59:42 localhost systemd-udevd[310329]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 04:59:42 localhost ovn_controller[153664]: 2025-11-26T09:59:42Z|00155|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:42 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:42.233 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5230fb15-6d27-4074-9dee-73233f11625e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5230fb15-6d27-4074-9dee-73233f11625e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f6aa38ed66a493cb8e8466fbc8ea76a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=20dc77b0-98d7-4e3c-a77b-614db2aec6ef, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4628ae84-e43c-4c22-8931-7c0550ea7c42) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:59:42 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:42.235 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 4628ae84-e43c-4c22-8931-7c0550ea7c42 in datapath 5230fb15-6d27-4074-9dee-73233f11625e bound to our chassis#033[00m Nov 26 04:59:42 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:42.237 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no 
metadata port for network 5230fb15-6d27-4074-9dee-73233f11625e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 04:59:42 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:42.239 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[5110316f-b9b0-4bb6-a126-d0b616a70bdf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.262 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.277 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:42 localhost ovn_controller[153664]: 2025-11-26T09:59:42Z|00156|binding|INFO|Setting lport 4628ae84-e43c-4c22-8931-7c0550ea7c42 ovn-installed in OVS Nov 26 04:59:42 localhost ovn_controller[153664]: 2025-11-26T09:59:42Z|00157|binding|INFO|Setting lport 4628ae84-e43c-4c22-8931-7c0550ea7c42 up in Southbound Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.284 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.309 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:42 localhost nova_compute[281415]: 2025-11-26 09:59:42.344 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:43 localhost podman[310385]: Nov 26 04:59:43 localhost podman[310385]: 2025-11-26 09:59:43.278682938 
+0000 UTC m=+0.104162143 container create f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 04:59:43 localhost systemd[1]: Started libpod-conmon-f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876.scope. Nov 26 04:59:43 localhost podman[310385]: 2025-11-26 09:59:43.227976872 +0000 UTC m=+0.053456087 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 04:59:43 localhost systemd[1]: Started libcrun container. Nov 26 04:59:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9415c9a89010a49699ddf5624fb7ebbd600a482a114211946090a9810c27d009/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 04:59:43 localhost podman[310385]: 2025-11-26 09:59:43.361324525 +0000 UTC m=+0.186803710 container init f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 04:59:43 localhost podman[310385]: 2025-11-26 09:59:43.372275658 +0000 UTC m=+0.197754853 container start 
f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 04:59:43 localhost dnsmasq[310403]: started, version 2.85 cachesize 150 Nov 26 04:59:43 localhost dnsmasq[310403]: DNS service limited to local subnets Nov 26 04:59:43 localhost dnsmasq[310403]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 04:59:43 localhost dnsmasq[310403]: warning: no upstream servers configured Nov 26 04:59:43 localhost dnsmasq-dhcp[310403]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 04:59:43 localhost dnsmasq[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/addn_hosts - 0 addresses Nov 26 04:59:43 localhost dnsmasq-dhcp[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/host Nov 26 04:59:43 localhost dnsmasq-dhcp[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/opts Nov 26 04:59:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:43.539 262471 INFO neutron.agent.dhcp.agent [None req-310a19e7-9486-4dd8-be3a-60736bff5aa3 - - - - - -] DHCP configuration for ports {'4a2b5fbf-a6df-44f0-be90-2ed3162bc90f'} is completed#033[00m Nov 26 04:59:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 04:59:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 04:59:44 localhost podman[310404]: 2025-11-26 09:59:44.330525659 +0000 UTC m=+0.084607146 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Nov 26 04:59:44 localhost podman[310404]: 2025-11-26 09:59:44.336629119 +0000 UTC m=+0.090710636 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent) Nov 26 04:59:44 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 04:59:44 localhost podman[310405]: 2025-11-26 09:59:44.3858267 +0000 UTC m=+0.136700693 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.license=GPLv2) Nov 26 04:59:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e108 e108: 6 total, 6 up, 6 in Nov 26 04:59:44 localhost podman[310405]: 2025-11-26 09:59:44.403556923 +0000 UTC m=+0.154430906 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3) Nov 26 04:59:44 localhost systemd[1]: 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 04:59:45 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e109 e109: 6 total, 6 up, 6 in Nov 26 04:59:45 localhost nova_compute[281415]: 2025-11-26 09:59:45.580 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:45 localhost openstack_network_exporter[242153]: ERROR 09:59:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 04:59:45 localhost openstack_network_exporter[242153]: ERROR 09:59:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:59:45 localhost openstack_network_exporter[242153]: ERROR 09:59:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 04:59:45 localhost openstack_network_exporter[242153]: ERROR 09:59:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 04:59:45 localhost openstack_network_exporter[242153]: Nov 26 04:59:45 localhost openstack_network_exporter[242153]: ERROR 09:59:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 04:59:45 localhost openstack_network_exporter[242153]: Nov 26 04:59:45 localhost nova_compute[281415]: 2025-11-26 09:59:45.927 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:46 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e110 e110: 6 total, 6 up, 6 in Nov 26 04:59:46 localhost nova_compute[281415]: 2025-11-26 09:59:46.782 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:47.050 262471 
INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:59:46Z, description=, device_id=e172e3e6-383e-43a5-abe7-dc28b36f9927, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e1a26500-0854-4300-a3f4-a83ecd16963d, ip_allocation=immediate, mac_address=fa:16:3e:10:eb:df, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:59:39Z, description=, dns_domain=, id=5230fb15-6d27-4074-9dee-73233f11625e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-596290282-network, port_security_enabled=True, project_id=8f6aa38ed66a493cb8e8466fbc8ea76a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52755, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=812, status=ACTIVE, subnets=['eea8bb90-14a3-4bff-8239-6edec8a6219d'], tags=[], tenant_id=8f6aa38ed66a493cb8e8466fbc8ea76a, updated_at=2025-11-26T09:59:41Z, vlan_transparent=None, network_id=5230fb15-6d27-4074-9dee-73233f11625e, port_security_enabled=False, project_id=8f6aa38ed66a493cb8e8466fbc8ea76a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=856, status=DOWN, tags=[], tenant_id=8f6aa38ed66a493cb8e8466fbc8ea76a, updated_at=2025-11-26T09:59:46Z on network 5230fb15-6d27-4074-9dee-73233f11625e#033[00m Nov 26 04:59:47 localhost systemd[1]: tmp-crun.qe4Gwj.mount: Deactivated successfully. 
Nov 26 04:59:47 localhost dnsmasq[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/addn_hosts - 1 addresses Nov 26 04:59:47 localhost dnsmasq-dhcp[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/host Nov 26 04:59:47 localhost dnsmasq-dhcp[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/opts Nov 26 04:59:47 localhost podman[310457]: 2025-11-26 09:59:47.312035939 +0000 UTC m=+0.080005659 container kill f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 04:59:47 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e111 e111: 6 total, 6 up, 6 in Nov 26 04:59:47 localhost nova_compute[281415]: 2025-11-26 09:59:47.485 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:47.592 262471 INFO neutron.agent.dhcp.agent [None req-48348426-c51b-499c-b4f4-edcba6e64b52 - - - - - -] DHCP configuration for ports {'e1a26500-0854-4300-a3f4-a83ecd16963d'} is completed#033[00m Nov 26 04:59:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:47.860 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T09:59:46Z, description=, 
device_id=e172e3e6-383e-43a5-abe7-dc28b36f9927, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e1a26500-0854-4300-a3f4-a83ecd16963d, ip_allocation=immediate, mac_address=fa:16:3e:10:eb:df, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T09:59:39Z, description=, dns_domain=, id=5230fb15-6d27-4074-9dee-73233f11625e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-596290282-network, port_security_enabled=True, project_id=8f6aa38ed66a493cb8e8466fbc8ea76a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52755, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=812, status=ACTIVE, subnets=['eea8bb90-14a3-4bff-8239-6edec8a6219d'], tags=[], tenant_id=8f6aa38ed66a493cb8e8466fbc8ea76a, updated_at=2025-11-26T09:59:41Z, vlan_transparent=None, network_id=5230fb15-6d27-4074-9dee-73233f11625e, port_security_enabled=False, project_id=8f6aa38ed66a493cb8e8466fbc8ea76a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=856, status=DOWN, tags=[], tenant_id=8f6aa38ed66a493cb8e8466fbc8ea76a, updated_at=2025-11-26T09:59:46Z on network 5230fb15-6d27-4074-9dee-73233f11625e#033[00m Nov 26 04:59:48 localhost dnsmasq[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/addn_hosts - 1 addresses Nov 26 04:59:48 localhost dnsmasq-dhcp[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/host Nov 26 04:59:48 localhost podman[310495]: 2025-11-26 09:59:48.123633866 +0000 UTC m=+0.054082085 container kill f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 04:59:48 localhost dnsmasq-dhcp[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/opts Nov 26 04:59:48 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:48.373 262471 INFO neutron.agent.dhcp.agent [None req-4c210663-c512-4a53-9242-638cecf8f7ea - - - - - -] DHCP configuration for ports {'e1a26500-0854-4300-a3f4-a83ecd16963d'} is completed#033[00m Nov 26 04:59:48 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e112 e112: 6 total, 6 up, 6 in Nov 26 04:59:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e113 e113: 6 total, 6 up, 6 in Nov 26 04:59:50 localhost dnsmasq[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/addn_hosts - 0 addresses Nov 26 04:59:50 localhost dnsmasq-dhcp[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/host Nov 26 04:59:50 localhost podman[310532]: 2025-11-26 09:59:50.109367989 +0000 UTC m=+0.064553014 container kill f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base 
Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:50 localhost dnsmasq-dhcp[310403]: read /var/lib/neutron/dhcp/5230fb15-6d27-4074-9dee-73233f11625e/opts Nov 26 04:59:50 localhost ovn_controller[153664]: 2025-11-26T09:59:50Z|00158|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:50 localhost kernel: device tap4628ae84-e4 left promiscuous mode Nov 26 04:59:50 localhost nova_compute[281415]: 2025-11-26 09:59:50.352 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:50 localhost ovn_controller[153664]: 2025-11-26T09:59:50Z|00159|binding|INFO|Releasing lport 4628ae84-e43c-4c22-8931-7c0550ea7c42 from this chassis (sb_readonly=0) Nov 26 04:59:50 localhost ovn_controller[153664]: 2025-11-26T09:59:50Z|00160|binding|INFO|Setting lport 4628ae84-e43c-4c22-8931-7c0550ea7c42 down in Southbound Nov 26 04:59:50 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:50.372 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5230fb15-6d27-4074-9dee-73233f11625e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5230fb15-6d27-4074-9dee-73233f11625e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f6aa38ed66a493cb8e8466fbc8ea76a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': 
'', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=20dc77b0-98d7-4e3c-a77b-614db2aec6ef, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4628ae84-e43c-4c22-8931-7c0550ea7c42) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 04:59:50 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:50.374 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 4628ae84-e43c-4c22-8931-7c0550ea7c42 in datapath 5230fb15-6d27-4074-9dee-73233f11625e unbound from our chassis#033[00m Nov 26 04:59:50 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:50.376 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5230fb15-6d27-4074-9dee-73233f11625e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 04:59:50 localhost ovn_metadata_agent[159481]: 2025-11-26 09:59:50.378 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[88f1622d-99ac-4db2-a06c-1508ff51ec1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 04:59:50 localhost nova_compute[281415]: 2025-11-26 09:59:50.394 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:50 localhost nova_compute[281415]: 2025-11-26 09:59:50.398 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:50 localhost nova_compute[281415]: 2025-11-26 09:59:50.406 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:50 
localhost nova_compute[281415]: 2025-11-26 09:59:50.582 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:51 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e114 e114: 6 total, 6 up, 6 in Nov 26 04:59:51 localhost nova_compute[281415]: 2025-11-26 09:59:51.832 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:52 localhost ovn_controller[153664]: 2025-11-26T09:59:52Z|00161|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 04:59:52 localhost nova_compute[281415]: 2025-11-26 09:59:52.374 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e115 e115: 6 total, 6 up, 6 in Nov 26 04:59:52 localhost dnsmasq[310403]: exiting on receipt of SIGTERM Nov 26 04:59:52 localhost podman[310573]: 2025-11-26 09:59:52.982046311 +0000 UTC m=+0.067908174 container kill f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 04:59:52 localhost systemd[1]: libpod-f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876.scope: Deactivated successfully. 
Nov 26 04:59:53 localhost podman[310588]: 2025-11-26 09:59:53.065264775 +0000 UTC m=+0.061636269 container died f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 26 04:59:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876-userdata-shm.mount: Deactivated successfully. Nov 26 04:59:53 localhost systemd[1]: var-lib-containers-storage-overlay-9415c9a89010a49699ddf5624fb7ebbd600a482a114211946090a9810c27d009-merged.mount: Deactivated successfully. Nov 26 04:59:53 localhost podman[310588]: 2025-11-26 09:59:53.107736838 +0000 UTC m=+0.104108282 container remove f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5230fb15-6d27-4074-9dee-73233f11625e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 04:59:53 localhost systemd[1]: libpod-conmon-f09bb269d9979abd96f973d9ec19dbc04aaf55dc9c89526233c2cade83f6d876.scope: Deactivated successfully. 
Nov 26 04:59:53 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:53.144 262471 INFO neutron.agent.dhcp.agent [None req-195252bd-8981-4031-9f78-42da705471f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:53 localhost neutron_dhcp_agent[262467]: 2025-11-26 09:59:53.242 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 04:59:53 localhost systemd[1]: run-netns-qdhcp\x2d5230fb15\x2d6d27\x2d4074\x2d9dee\x2d73233f11625e.mount: Deactivated successfully. Nov 26 04:59:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e116 e116: 6 total, 6 up, 6 in Nov 26 04:59:55 localhost nova_compute[281415]: 2025-11-26 09:59:55.583 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:55 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 04:59:55 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:59:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e117 e117: 6 total, 6 up, 6 in Nov 26 04:59:56 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e118 e118: 6 total, 6 up, 6 in Nov 26 04:59:56 localhost nova_compute[281415]: 2025-11-26 09:59:56.878 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 04:59:57 localhost podman[240049]: time="2025-11-26T09:59:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 04:59:57 localhost podman[240049]: @ - - [26/Nov/2025:09:59:57 +0000] 
"GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 04:59:57 localhost podman[240049]: @ - - [26/Nov/2025:09:59:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18775 "" "Go-http-client/1.1" Nov 26 04:59:57 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e119 e119: 6 total, 6 up, 6 in Nov 26 04:59:57 localhost neutron_sriov_agent[255515]: 2025-11-26 09:59:57.749 2 INFO neutron.agent.securitygroups_rpc [None req-259e3320-aac7-4d1b-ad51-b6386cf85fbf 548134ffb793479eb94902ddefb1b5ab e3b38f82c6da4ad287ebc431c374eafd - - default default] Security group member updated ['d6e17814-1e68-46b5-8f93-34c56ab1f155']#033[00m Nov 26 04:59:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e120 e120: 6 total, 6 up, 6 in Nov 26 04:59:58 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 04:59:58 localhost nova_compute[281415]: 2025-11-26 09:59:58.756 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:59:58 localhost neutron_sriov_agent[255515]: 2025-11-26 09:59:58.774 2 INFO neutron.agent.securitygroups_rpc [None req-8f333ddf-95b1-47b5-b17d-4c21698bba3d 548134ffb793479eb94902ddefb1b5ab e3b38f82c6da4ad287ebc431c374eafd - - default default] Security group member updated ['d6e17814-1e68-46b5-8f93-34c56ab1f155']#033[00m Nov 26 04:59:58 localhost nova_compute[281415]: 2025-11-26 09:59:58.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:59:59 localhost 
ceph-mon[297296]: mon.np0005536118@1(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 04:59:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 04:59:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 04:59:59 localhost podman[310700]: 2025-11-26 09:59:59.835580464 +0000 UTC m=+0.086444781 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 04:59:59 localhost nova_compute[281415]: 2025-11-26 09:59:59.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 04:59:59 localhost podman[310700]: 2025-11-26 09:59:59.877263033 +0000 UTC m=+0.128127340 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 04:59:59 localhost systemd[1]: tmp-crun.5qP6Tr.mount: Deactivated successfully. Nov 26 04:59:59 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 04:59:59 localhost podman[310699]: 2025-11-26 09:59:59.907308738 +0000 UTC m=+0.159720801 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 04:59:59 localhost podman[310699]: 2025-11-26 09:59:59.922430525 
+0000 UTC m=+0.174842638 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 04:59:59 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 05:00:00 localhost nova_compute[281415]: 2025-11-26 10:00:00.544 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:00 localhost nova_compute[281415]: 2025-11-26 10:00:00.585 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:00 localhost ceph-mon[297296]: overall HEALTH_OK Nov 26 05:00:00 localhost nova_compute[281415]: 2025-11-26 10:00:00.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:00:00 localhost nova_compute[281415]: 2025-11-26 10:00:00.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:00:00 localhost nova_compute[281415]: 2025-11-26 10:00:00.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:00:00 localhost nova_compute[281415]: 2025-11-26 10:00:00.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:00:01 localhost nova_compute[281415]: 2025-11-26 10:00:01.851 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:00:01 localhost nova_compute[281415]: 2025-11-26 10:00:01.904 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:02 localhost nova_compute[281415]: 2025-11-26 10:00:02.843 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:00:02 localhost nova_compute[281415]: 2025-11-26 10:00:02.917 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:00:03 localhost nova_compute[281415]: 2025-11-26 10:00:03.088 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:00:03 localhost nova_compute[281415]: 2025-11-26 10:00:03.088 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:00:03 localhost nova_compute[281415]: 2025-11-26 10:00:03.089 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:00:03 localhost nova_compute[281415]: 2025-11-26 10:00:03.089 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:00:03 localhost nova_compute[281415]: 2025-11-26 10:00:03.090 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:00:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 e121: 6 total, 6 up, 6 in Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.585 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': 
'7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.586 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 05:00:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:00:03 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2838886077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.616 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.617 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e30f2e05-64d9-40b5-9b91-4133aa55b18f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.586652', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adb8ad76-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': 'ba4f83056636ce5890fa845698ddcedea74a39378be52af579535e463f11f573'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.586652', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adb8c2b6-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': '73c4eb0138b158bff4212b443d5fd6444c84d2b984c509987f2b8ecc1ba7b1b9'}]}, 'timestamp': '2025-11-26 10:00:03.617477', '_unique_id': '54b9e15d745c4c9dba2c28bbafad7336'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.619 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.621 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.621 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f7a2f7a-d58d-41bb-ac9d-d6c9db480fe7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.621014', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adb95ef6-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': '4e0dadcf603df451c220fbb98c4600b106e2d850364e80d8ec6f1135c2c21a74'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.621014', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adb96f22-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': '770ae7c1c7201b62836965581a6c1643c35525044d9a07666ed25703cab8999a'}]}, 'timestamp': '2025-11-26 10:00:03.621863', '_unique_id': '587494b893e541a2b27875f3745294e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.622 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.623 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.624 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 05:00:03 localhost nova_compute[281415]: 2025-11-26 10:00:03.631 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df 
--format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.541s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.647 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 16000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30537f43-5085-4b01-abd3-7af9e8791286', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16000000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:00:03.624253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'adbd5baa-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 
11678.889233731, 'message_signature': 'f53058f819e7390db22e826445a177d90254e441e125983ac7b00055c2b82802'}]}, 'timestamp': '2025-11-26 10:00:03.647620', '_unique_id': 'e3a040e0a25a45399ba6373ac6c23c58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.648 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.649 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.666 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:03.666 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.667 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:03.667 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:00:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:03.668 159486 DEBUG 
oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54060a67-50c5-4c3c-8284-6421903f8676', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.649812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adc05ef4-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.892060634, 'message_signature': '388735e33669669aa02705fa6849f49dc34ed4ef46628998e9110b15cec816e3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 
'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.649812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adc0710a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.892060634, 'message_signature': '45d5d6374139f5a74f8ed7385c0b0361c05092af4ab5c612d6f5714d1014bc04'}]}, 'timestamp': '2025-11-26 10:00:03.667791', '_unique_id': '797c8c04e35c470b8483e63e7baeeaee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in 
connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR 
oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:00:03.668 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.670 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.670 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43f666c6-697a-435f-bc02-1bad87a0ef4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.670031', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adc0d942-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': 
'89a9c0e6def070f08bb2cf0ebe7bb8618cf9c991577c7e0c77fe74fbf1d3bfa3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.670031', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adc0e932-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': 'a0175b6f8dae757992dabe4fc0b8f11a8208f48c982f72c39a7861a3509d0658'}]}, 'timestamp': '2025-11-26 10:00:03.670861', '_unique_id': 'f1995f3fe5254c57a3a13273b973e3f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:00:03.671 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.675 12 
DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0878f331-9c72-47ac-be7a-154757431f81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7557, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.673041', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc1c30c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': 
'41d981773e0443e27016755a16d2339b9eb1b0896a3a18ba9bbcc2a1d71a5910'}]}, 'timestamp': '2025-11-26 10:00:03.676473', '_unique_id': 'e0708153ca524703b3ad24a3333bb3f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f7e99ddf-f327-4641-87d6-47a4e44a5c2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.678597', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc22928-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': '884ecb51af4b51d7a48ffe4bf2d0db379413617a5b553072256f8d688b4b54c3'}]}, 'timestamp': '2025-11-26 10:00:03.679117', '_unique_id': 'ac85589657444144bc0dbcabb3180646'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.681 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.681 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '37538f73-90c2-4a16-8267-cd3e4dfb64e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.681539', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc29c32-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': '29b576d43e8b16eb4f50e9243111661f97b8765e5ebcff61ac216b05b2ee7283'}]}, 'timestamp': '2025-11-26 10:00:03.682068', '_unique_id': 'e6908d5a62f44ca6a2be3499369d8116'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.682 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.684 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.684 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04f4e21c-9c47-417c-b8d2-9518841e965c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.684214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adc3030c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.892060634, 'message_signature': '6a7872e01d98c5eb09e1f597e0e68033852be8035d3fa9250db06035e3c70a44'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.684214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adc31306-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.892060634, 'message_signature': 'b093c37d29b2b91ba2f72418ca05e6de41c8782e8b347a28aaa4d8a4eaa847f6'}]}, 'timestamp': '2025-11-26 10:00:03.685073', '_unique_id': '37642ea8ad8d47879762432101fb5403'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.686 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.687 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '534f9710-2394-452f-9610-f575f19ea58f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.687213', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc37968-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': '6243c8b3ba84d48fc909541518b36671e5c521d50ea2fa2ebf8810d920706f18'}]}, 'timestamp': '2025-11-26 10:00:03.687690', '_unique_id': 'cf644128d0ba4bd4bd559315fa0cee65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.688 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.689 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.689 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '101bb4cc-3d51-4f2e-92f7-741dea3fcd50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.689762', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adc3dd36-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': '81f32370d6cb7052754ccfa253e632d09af61bf2c96f7f25d7d7df802e17ce64'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.689762', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adc3ed44-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': 'e81000e15bd7445e36c572b13d66d98292e44600cd5983b19212c76842aded4a'}]}, 'timestamp': '2025-11-26 10:00:03.690628', '_unique_id': '6c248d9dc0f84135bed967ecd2359567'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.691 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.692 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32114a8d-91be-4b5a-9a7d-5e658a631194', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.693025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adc45b4e-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': 'ae9ff47523c53e5d7b307155ec0a6331a07af796a903ecd3875e84d2bfb1257a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.693025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adc46b7a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': '88d0454ea5a125700b16773ac58a3dc43ad3dda27d61d0402de1571b554a6b0f'}]}, 'timestamp': '2025-11-26 10:00:03.693883', '_unique_id': '2bbd714e364e4ea1b49b1448f83d50dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:00:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.695 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.696 12 DEBUG 
ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f55ae8e-80ca-4c5c-85a4-d6446bade6a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.696091', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc4d326-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': 
'850f330a9166aed212e5aae1c836b8bc92ba78674ee9844bb9b6c749d1f53139'}]}, 'timestamp': '2025-11-26 10:00:03.696540', '_unique_id': '4cb69cdd839c48ccb0c0c52bb2706bd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.697 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.698 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.698 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7fbaf100-ce72-4423-b605-3dcadf860ce6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.698622', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc535dc-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': '6822396c31dfcf5af37c3b9e21d0cca287bca3c133cf92c9b9edcd2abdc67b9c'}]}, 'timestamp': '2025-11-26 10:00:03.699098', '_unique_id': '1d91287c582045b2bef5e9e30dc5f15b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.701 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2d1b0a0-e495-4a0c-b0f8-71b55307af21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.701170', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc5998c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': 'ff9e021de6d76fa8b1cc566ad9169c3f38908db165bab64c625d6022449f7899'}]}, 'timestamp': '2025-11-26 10:00:03.701619', '_unique_id': '0f45f73b1ba5405ca0fc5f1ce3b0f1a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.703 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9ff729b4-eed2-4bbb-a192-641e2e1ccf36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.703710', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc5fdc8-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': 'a583e2453a8c7330b7226550781420ed5d1ed5108aa2530c51aaabd0ddf45f46'}]}, 'timestamp': '2025-11-26 10:00:03.704188', '_unique_id': 'ee613f1611eb4814b5b5932f28a19e97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.706 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.706 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.706 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd69b03e3-16fd-4143-8def-30152276437a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.706268', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adc6606a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.892060634, 'message_signature': '5e968cf9eb179450446dbed1bb983e38f89928270a9c0f75f43a04b6101de77d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.706268', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adc67046-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.892060634, 'message_signature': 'ccc7491a6624cec469355908801a44994a55a66f1eba73f601ba2ce9a9b4b478'}]}, 'timestamp': '2025-11-26 10:00:03.707118', '_unique_id': '6fe591f8d2624466b345dd4c7b912bb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.708 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.709 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.709 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '348242da-7646-4d95-a7c6-8a0f54fdf071', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:00:03.709245', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'adc6d4c8-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.889233731, 'message_signature': 'e053d8e4460faf94e1bf4995cf31c56682bf110e2b8f070fb7698ae69bd03d7e'}]}, 'timestamp': '2025-11-26 10:00:03.709685', '_unique_id': '2f11e8cf6a4d43da852d18aba64649ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.710 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.711 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.711 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 68 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b8b285d4-bc18-43d7-8303-fe6779a09c6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 68, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.711891', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc73bb6-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': 'dd4fd1a480e7a226ae4436839d7955f76fa9715ff2ee33ef2e34c5907747eba7'}]}, 'timestamp': '2025-11-26 10:00:03.712239', '_unique_id': '6fc6b4bc6fbb451b8ef03e93a17daf21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.713 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19696012-d4ba-482d-8e8e-cc59929a84bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:00:03.713580', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'adc77b08-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.91529336, 'message_signature': 'cd395ce42c1e2b80de038ba8a2a709640dc680dc19c1c0606b594d8c0ae52d67'}]}, 'timestamp': '2025-11-26 10:00:03.713860', '_unique_id': '94c27d2d88584050b1f2defe564a21b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 
12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.715 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.715 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.715 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a2e2db14-75a8-45da-90f2-761219842bfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:00:03.715138', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adc7b7b2-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': '55006a2ade518de17075441dc862c232921e3e05d241466c165f8d67ae015d50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:00:03.715138', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adc7c23e-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11678.828930633, 'message_signature': '21480169727f04336884a183c2ea58ec0835498e2bde8ad96791573ef26d8b30'}]}, 'timestamp': '2025-11-26 10:00:03.715663', '_unique_id': 'bfaf2857dc3a46639d5b029a4d6bb1f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:00:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:00:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:00:03.716 12 ERROR oslo_messaging.notify.messaging Nov 26 05:00:03 localhost nova_compute[281415]: 2025-11-26 10:00:03.879 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:00:03 localhost nova_compute[281415]: 2025-11-26 10:00:03.880 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.112 281419 WARNING nova.virt.libvirt.driver [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.114 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11276MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.115 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.115 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:00:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.391 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.392 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.392 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.434 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:00:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:00:04 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2221340159' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.905 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:00:04 localhost nova_compute[281415]: 2025-11-26 10:00:04.912 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:00:05 localhost nova_compute[281415]: 2025-11-26 10:00:05.012 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:00:05 localhost nova_compute[281415]: 2025-11-26 10:00:05.166 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:00:05 localhost nova_compute[281415]: 2025-11-26 10:00:05.167 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:00:05 localhost nova_compute[281415]: 2025-11-26 10:00:05.587 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:00:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:00:05 localhost podman[310783]: 2025-11-26 10:00:05.847772724 +0000 UTC m=+0.101059031 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:00:05 localhost podman[310783]: 2025-11-26 10:00:05.898692076 +0000 UTC m=+0.151978373 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:00:05 localhost podman[310784]: 2025-11-26 10:00:05.900737097 +0000 UTC m=+0.150242443 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.) Nov 26 05:00:05 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:00:05 localhost podman[310784]: 2025-11-26 10:00:05.983465726 +0000 UTC m=+0.232971032 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350) Nov 26 05:00:05 localhost systemd[1]: 
a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:00:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:06.379 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:06.380 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:00:06 localhost nova_compute[281415]: 2025-11-26 10:00:06.387 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:06.621 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:06 localhost 
ovn_metadata_agent[159481]: 2025-11-26 10:00:06.622 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:00:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:06.623 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:00:06 localhost nova_compute[281415]: 2025-11-26 10:00:06.662 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:06 localhost nova_compute[281415]: 2025-11-26 10:00:06.908 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:07 localhost nova_compute[281415]: 2025-11-26 10:00:07.098 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:00:07 localhost nova_compute[281415]: 2025-11-26 10:00:07.099 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:00:07 localhost nova_compute[281415]: 2025-11-26 10:00:07.099 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:00:07 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:07.617 2 INFO neutron.agent.securitygroups_rpc [None req-8cbeeeea-d660-4c01-b1f8-ec4520dbfed8 a6d3119a44eb4dd6b7a9872b07560553 b9f4c56ce4e7446ea02b109676f985d4 - - default default] Security group member updated ['d0702f76-4e73-4a67-b21b-4066319d1117']#033[00m Nov 26 05:00:07 localhost ovn_controller[153664]: 2025-11-26T10:00:07Z|00162|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:00:07 localhost nova_compute[281415]: 2025-11-26 10:00:07.677 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:07 localhost nova_compute[281415]: 2025-11-26 10:00:07.916 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:00:07 localhost nova_compute[281415]: 2025-11-26 10:00:07.917 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:00:07 localhost nova_compute[281415]: 2025-11-26 10:00:07.917 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:00:07 localhost nova_compute[281415]: 2025-11-26 10:00:07.918 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 
9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:00:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:09 localhost nova_compute[281415]: 2025-11-26 10:00:09.348 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:00:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:09.382 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, 
record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:00:09 localhost nova_compute[281415]: 2025-11-26 10:00:09.384 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:00:09 localhost nova_compute[281415]: 2025-11-26 10:00:09.384 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:00:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:00:09 localhost systemd[1]: tmp-crun.BlvAdT.mount: Deactivated successfully. 
Nov 26 05:00:09 localhost podman[310829]: 2025-11-26 10:00:09.823288708 +0000 UTC m=+0.078560358 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 05:00:09 localhost podman[310829]: 2025-11-26 10:00:09.837157897 +0000 UTC m=+0.092429537 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:00:09 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:00:10 localhost sshd[310852]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:00:10 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:10.103 2 INFO neutron.agent.securitygroups_rpc [None req-15999523-4cb6-4d66-bfcc-5e92a1e5386a a6d3119a44eb4dd6b7a9872b07560553 b9f4c56ce4e7446ea02b109676f985d4 - - default default] Security group member updated ['d0702f76-4e73-4a67-b21b-4066319d1117']#033[00m Nov 26 05:00:10 localhost nova_compute[281415]: 2025-11-26 10:00:10.589 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:10 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:10.664 262471 INFO neutron.agent.linux.ip_lib [None req-93da6862-896c-499a-8870-69ddf2d45963 - - - - - -] Device tapc3b952ce-9b cannot be used as it has no MAC address#033[00m Nov 26 05:00:10 localhost nova_compute[281415]: 2025-11-26 10:00:10.692 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:10 localhost kernel: device tapc3b952ce-9b entered promiscuous mode Nov 26 05:00:10 localhost NetworkManager[5970]: [1764151210.7046] manager: (tapc3b952ce-9b): new Generic device (/org/freedesktop/NetworkManager/Devices/29) Nov 26 05:00:10 localhost nova_compute[281415]: 2025-11-26 10:00:10.704 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:10 localhost systemd-udevd[310864]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:00:10 localhost ovn_controller[153664]: 2025-11-26T10:00:10Z|00163|binding|INFO|Claiming lport c3b952ce-9bb7-4678-bed6-0aa6419f4025 for this chassis. 
Nov 26 05:00:10 localhost ovn_controller[153664]: 2025-11-26T10:00:10Z|00164|binding|INFO|c3b952ce-9bb7-4678-bed6-0aa6419f4025: Claiming unknown Nov 26 05:00:10 localhost journal[229445]: ethtool ioctl error on tapc3b952ce-9b: No such device Nov 26 05:00:10 localhost ovn_controller[153664]: 2025-11-26T10:00:10Z|00165|binding|INFO|Setting lport c3b952ce-9bb7-4678-bed6-0aa6419f4025 ovn-installed in OVS Nov 26 05:00:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:10.752 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd37b8fb99ac14f7abe4246c12a00693a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5cb9bdc-f862-45b3-88c3-a89b4ea6e23b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c3b952ce-9bb7-4678-bed6-0aa6419f4025) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:10.755 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 
c3b952ce-9bb7-4678-bed6-0aa6419f4025 in datapath 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0 bound to our chassis#033[00m Nov 26 05:00:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:10.756 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:10 localhost nova_compute[281415]: 2025-11-26 10:00:10.755 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:10 localhost ovn_controller[153664]: 2025-11-26T10:00:10Z|00166|binding|INFO|Setting lport c3b952ce-9bb7-4678-bed6-0aa6419f4025 up in Southbound Nov 26 05:00:10 localhost journal[229445]: ethtool ioctl error on tapc3b952ce-9b: No such device Nov 26 05:00:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:10.757 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d46d40-bdff-4cea-a94c-7c76f0f90cf4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:10 localhost journal[229445]: ethtool ioctl error on tapc3b952ce-9b: No such device Nov 26 05:00:10 localhost journal[229445]: ethtool ioctl error on tapc3b952ce-9b: No such device Nov 26 05:00:10 localhost journal[229445]: ethtool ioctl error on tapc3b952ce-9b: No such device Nov 26 05:00:10 localhost journal[229445]: ethtool ioctl error on tapc3b952ce-9b: No such device Nov 26 05:00:10 localhost journal[229445]: ethtool ioctl error on tapc3b952ce-9b: No such device Nov 26 05:00:10 localhost journal[229445]: ethtool ioctl error on tapc3b952ce-9b: No such device Nov 26 05:00:10 localhost nova_compute[281415]: 2025-11-26 10:00:10.803 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:10 localhost nova_compute[281415]: 2025-11-26 10:00:10.837 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:11 localhost nova_compute[281415]: 2025-11-26 10:00:11.800 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:11 localhost podman[310935]: Nov 26 05:00:11 localhost podman[310935]: 2025-11-26 10:00:11.848149135 +0000 UTC m=+0.120626789 container create ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:00:11 localhost systemd[1]: Started libpod-conmon-ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e.scope. Nov 26 05:00:11 localhost podman[310935]: 2025-11-26 10:00:11.808155425 +0000 UTC m=+0.080633150 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:00:11 localhost systemd[1]: Started libcrun container. 
Nov 26 05:00:11 localhost nova_compute[281415]: 2025-11-26 10:00:11.911 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40552554b86beceec2b6fc8b5bec668e9ca5f78cde377db62b1810fdfce4aab5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:00:11 localhost podman[310935]: 2025-11-26 10:00:11.928053231 +0000 UTC m=+0.200530905 container init ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:00:11 localhost podman[310935]: 2025-11-26 10:00:11.93955234 +0000 UTC m=+0.212030014 container start ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:00:11 localhost dnsmasq[310954]: started, version 2.85 cachesize 150 Nov 26 05:00:11 localhost dnsmasq[310954]: DNS service limited to local subnets Nov 26 05:00:11 localhost dnsmasq[310954]: compile time options: IPv6 
GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:00:11 localhost dnsmasq[310954]: warning: no upstream servers configured Nov 26 05:00:11 localhost dnsmasq-dhcp[310954]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:00:11 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 0 addresses Nov 26 05:00:11 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:11 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:12 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:12.177 262471 INFO neutron.agent.dhcp.agent [None req-d5c3d6a9-67be-453e-9306-e5479535f83b - - - - - -] DHCP configuration for ports {'c8d38661-b11f-4233-98d6-75e543014fd2'} is completed#033[00m Nov 26 05:00:12 localhost nova_compute[281415]: 2025-11-26 10:00:12.184 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:12.939 2 INFO neutron.agent.securitygroups_rpc [None req-7fbea241-f06a-4686-97c3-fe162ced3a7e 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:12 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:12.979 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:12Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=785f04e8-9e02-4ce1-b9d0-f8f326a8c70d, 
ip_allocation=immediate, mac_address=fa:16:3e:c1:db:c0, name=tempest-AllowedAddressPairTestJSON-1558802090, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:06Z, description=, dns_domain=, id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-92122956, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45946, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=984, status=ACTIVE, subnets=['cfe82ce6-9e60-497f-aec2-98ba55a9d5d8'], tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:09Z, vlan_transparent=None, network_id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9aae036b-31a0-48e0-a520-90f4ef48cf07'], standard_attr_id=1007, status=DOWN, tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:12Z on network 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0#033[00m Nov 26 05:00:13 localhost podman[310970]: 2025-11-26 10:00:13.249920255 +0000 UTC m=+0.063736281 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:00:13 
localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 1 addresses Nov 26 05:00:13 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:13 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:13 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:13.595 262471 INFO neutron.agent.dhcp.agent [None req-7e969b65-2d3c-4293-a3da-91c240ae2849 - - - - - -] DHCP configuration for ports {'785f04e8-9e02-4ce1-b9d0-f8f326a8c70d'} is completed#033[00m Nov 26 05:00:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:14 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:14.457 2 INFO neutron.agent.securitygroups_rpc [None req-9b51f3e5-62eb-48d2-a438-3677da851903 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:14 localhost ovn_controller[153664]: 2025-11-26T10:00:14Z|00167|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:00:14 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:14.549 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:13Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=00fb450c-743c-40f0-8d30-d81d1fc30a70, ip_allocation=immediate, mac_address=fa:16:3e:f6:a1:00, name=tempest-AllowedAddressPairTestJSON-1394669906, network=admin_state_up=True, availability_zone_hints=[], 
availability_zones=[], created_at=2025-11-26T10:00:06Z, description=, dns_domain=, id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-92122956, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45946, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=984, status=ACTIVE, subnets=['cfe82ce6-9e60-497f-aec2-98ba55a9d5d8'], tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:09Z, vlan_transparent=None, network_id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9aae036b-31a0-48e0-a520-90f4ef48cf07'], standard_attr_id=1009, status=DOWN, tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:14Z on network 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0#033[00m Nov 26 05:00:14 localhost nova_compute[281415]: 2025-11-26 10:00:14.549 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:00:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 05:00:14 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 2 addresses Nov 26 05:00:14 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:14 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:14 localhost podman[311009]: 2025-11-26 10:00:14.821542475 +0000 UTC m=+0.080964909 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:00:14 localhost systemd[1]: tmp-crun.60LpAC.mount: Deactivated successfully. 
Nov 26 05:00:14 localhost podman[311012]: 2025-11-26 10:00:14.945309765 +0000 UTC m=+0.193170098 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd) Nov 26 05:00:14 localhost podman[311010]: 2025-11-26 10:00:14.9112522 +0000 UTC m=+0.161685618 container health_status 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:00:14 localhost podman[311012]: 2025-11-26 10:00:14.96546418 +0000 UTC m=+0.213324503 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:14 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:00:14 localhost podman[311010]: 2025-11-26 10:00:14.994472335 +0000 UTC m=+0.244905753 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Nov 26 05:00:15 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:00:15 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:15.150 262471 INFO neutron.agent.dhcp.agent [None req-1f8ceda8-f4c3-4d9f-930d-c18fc9795c10 - - - - - -] DHCP configuration for ports {'00fb450c-743c-40f0-8d30-d81d1fc30a70'} is completed#033[00m Nov 26 05:00:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:15.523 2 INFO neutron.agent.securitygroups_rpc [None req-303a6e2c-0add-4546-91cd-147e102efa18 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:15 localhost nova_compute[281415]: 2025-11-26 10:00:15.633 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:15 localhost openstack_network_exporter[242153]: ERROR 10:00:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:00:15 localhost openstack_network_exporter[242153]: ERROR 10:00:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:00:15 localhost openstack_network_exporter[242153]: ERROR 10:00:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:00:15 localhost openstack_network_exporter[242153]: ERROR 10:00:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:00:15 localhost openstack_network_exporter[242153]: Nov 26 05:00:15 localhost openstack_network_exporter[242153]: ERROR 10:00:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:00:15 localhost openstack_network_exporter[242153]: Nov 26 05:00:15 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 
1 addresses Nov 26 05:00:15 localhost podman[311085]: 2025-11-26 10:00:15.810063118 +0000 UTC m=+0.068974986 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:00:15 localhost systemd[1]: tmp-crun.I34oai.mount: Deactivated successfully. Nov 26 05:00:15 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:15 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:16.190 2 INFO neutron.agent.securitygroups_rpc [None req-6f337358-e286-462a-9073-cbde780c63bb 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:16.245 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:15Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c06aed00-135d-4459-b6ab-d15a3b09ff28, ip_allocation=immediate, mac_address=fa:16:3e:a6:11:23, name=tempest-AllowedAddressPairTestJSON-1970190097, network=admin_state_up=True, availability_zone_hints=[], 
availability_zones=[], created_at=2025-11-26T10:00:06Z, description=, dns_domain=, id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-92122956, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45946, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=984, status=ACTIVE, subnets=['cfe82ce6-9e60-497f-aec2-98ba55a9d5d8'], tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:09Z, vlan_transparent=None, network_id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9aae036b-31a0-48e0-a520-90f4ef48cf07'], standard_attr_id=1016, status=DOWN, tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:16Z on network 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0#033[00m Nov 26 05:00:16 localhost systemd[1]: tmp-crun.tt0V3j.mount: Deactivated successfully. 
Nov 26 05:00:16 localhost podman[311120]: 2025-11-26 10:00:16.489169166 +0000 UTC m=+0.074855549 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:00:16 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 2 addresses Nov 26 05:00:16 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:16 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:16.801 262471 INFO neutron.agent.dhcp.agent [None req-e94cd106-2548-49d4-99fd-46f7850fcbd7 - - - - - -] DHCP configuration for ports {'c06aed00-135d-4459-b6ab-d15a3b09ff28'} is completed#033[00m Nov 26 05:00:16 localhost nova_compute[281415]: 2025-11-26 10:00:16.951 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:17 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:17.571 2 INFO neutron.agent.securitygroups_rpc [None req-957ead50-cc48-4bc4-8f2e-905d08dc737c 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:18 localhost systemd[1]: tmp-crun.uNBuMx.mount: Deactivated successfully. 
Nov 26 05:00:18 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 1 addresses Nov 26 05:00:18 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:18 localhost podman[311158]: 2025-11-26 10:00:18.058322973 +0000 UTC m=+0.073753737 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:00:18 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:18 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:18.399 2 INFO neutron.agent.securitygroups_rpc [None req-12d24bee-82ad-49e6-a9bd-97fe357e67f7 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:18.426 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9ac4e64a-2095-4def-ba36-457a4d35fa21, ip_allocation=immediate, mac_address=fa:16:3e:9a:26:f9, name=tempest-AllowedAddressPairTestJSON-1846153760, network=admin_state_up=True, 
availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:06Z, description=, dns_domain=, id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-92122956, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45946, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=984, status=ACTIVE, subnets=['cfe82ce6-9e60-497f-aec2-98ba55a9d5d8'], tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:09Z, vlan_transparent=None, network_id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9aae036b-31a0-48e0-a520-90f4ef48cf07'], standard_attr_id=1037, status=DOWN, tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:18Z on network 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0#033[00m Nov 26 05:00:18 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 2 addresses Nov 26 05:00:18 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:18 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:18 localhost podman[311197]: 2025-11-26 10:00:18.652204697 +0000 UTC m=+0.055783796 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 26 05:00:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:18.967 262471 INFO neutron.agent.dhcp.agent [None req-94f63700-fc32-4078-97bc-8bf35ef26e2b - - - - - -] DHCP configuration for ports {'9ac4e64a-2095-4def-ba36-457a4d35fa21'} is completed#033[00m Nov 26 05:00:19 localhost nova_compute[281415]: 2025-11-26 10:00:19.116 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:19 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:19.218 2 INFO neutron.agent.securitygroups_rpc [None req-d9b5470e-467c-4e2f-9eaf-5cbe5692d310 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:19 localhost nova_compute[281415]: 2025-11-26 10:00:19.343 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:19 localhost podman[311235]: 2025-11-26 10:00:19.519058652 +0000 UTC m=+0.063855584 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:00:19 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 1 addresses Nov 26 05:00:19 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:19 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:19 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:19.922 2 INFO neutron.agent.securitygroups_rpc [None req-eaa55e34-2657-48ae-84c6-9c16b6671375 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:19.960 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:19Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cd1e58a6-aee3-4320-a760-ab8f15294888, ip_allocation=immediate, mac_address=fa:16:3e:ac:ce:57, name=tempest-AllowedAddressPairTestJSON-526371495, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:06Z, description=, dns_domain=, id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-92122956, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45946, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=984, status=ACTIVE, 
subnets=['cfe82ce6-9e60-497f-aec2-98ba55a9d5d8'], tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:09Z, vlan_transparent=None, network_id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9aae036b-31a0-48e0-a520-90f4ef48cf07'], standard_attr_id=1043, status=DOWN, tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:19Z on network 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0#033[00m Nov 26 05:00:20 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 2 addresses Nov 26 05:00:20 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:20 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:20 localhost podman[311274]: 2025-11-26 10:00:20.235078219 +0000 UTC m=+0.074664383 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:00:20 localhost systemd[1]: tmp-crun.QQMZEc.mount: Deactivated successfully. 
Nov 26 05:00:20 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:20.373 2 INFO neutron.agent.securitygroups_rpc [None req-a49a460d-a05c-4b24-a598-36b4a5b52638 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:20.417 262471 INFO neutron.agent.dhcp.agent [None req-30f34449-369c-4ebf-890e-d201f6ce9971 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:20Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e3c8a495-f372-4c72-b0cf-b6f089812a58, ip_allocation=immediate, mac_address=fa:16:3e:2a:3c:13, name=tempest-AllowedAddressPairTestJSON-1980130516, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:06Z, description=, dns_domain=, id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-92122956, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45946, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=984, status=ACTIVE, subnets=['cfe82ce6-9e60-497f-aec2-98ba55a9d5d8'], tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:09Z, vlan_transparent=None, network_id=6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, port_security_enabled=True, project_id=d37b8fb99ac14f7abe4246c12a00693a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, 
security_groups=['9aae036b-31a0-48e0-a520-90f4ef48cf07'], standard_attr_id=1054, status=DOWN, tags=[], tenant_id=d37b8fb99ac14f7abe4246c12a00693a, updated_at=2025-11-26T10:00:20Z on network 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0#033[00m Nov 26 05:00:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:20.533 262471 INFO neutron.agent.dhcp.agent [None req-193ee6ed-a5bf-4876-b5a3-8a281b1c55a6 - - - - - -] DHCP configuration for ports {'cd1e58a6-aee3-4320-a760-ab8f15294888'} is completed#033[00m Nov 26 05:00:20 localhost nova_compute[281415]: 2025-11-26 10:00:20.675 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:20 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 3 addresses Nov 26 05:00:20 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:20 localhost podman[311312]: 2025-11-26 10:00:20.714006314 +0000 UTC m=+0.093497429 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:00:20 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:21.019 262471 INFO neutron.agent.dhcp.agent [None req-27f355f7-d625-427a-aed0-232631ea90cb - - - - - -] DHCP configuration for ports {'e3c8a495-f372-4c72-b0cf-b6f089812a58'} is 
completed#033[00m Nov 26 05:00:21 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:21.416 2 INFO neutron.agent.securitygroups_rpc [None req-419af1a5-6e76-4b4e-8d83-ee3588649363 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:21 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 2 addresses Nov 26 05:00:21 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:21 localhost podman[311350]: 2025-11-26 10:00:21.710237304 +0000 UTC m=+0.070518281 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:00:21 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:21 localhost nova_compute[281415]: 2025-11-26 10:00:21.986 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:22 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:22.398 2 INFO neutron.agent.securitygroups_rpc [None req-70c7bb4e-1499-4f77-95ab-737d71ad8c06 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:22 localhost dnsmasq[310954]: read 
/var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 1 addresses Nov 26 05:00:22 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:22 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:22 localhost podman[311385]: 2025-11-26 10:00:22.686139856 +0000 UTC m=+0.072819749 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:00:23 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:23.122 2 INFO neutron.agent.securitygroups_rpc [None req-5d82ca2f-2276-4414-b5c1-38abeefcdffd 684937b5c7734b9c969ca031c7cce5d1 d37b8fb99ac14f7abe4246c12a00693a - - default default] Security group member updated ['9aae036b-31a0-48e0-a520-90f4ef48cf07']#033[00m Nov 26 05:00:23 localhost dnsmasq[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/addn_hosts - 0 addresses Nov 26 05:00:23 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/host Nov 26 05:00:23 localhost dnsmasq-dhcp[310954]: read /var/lib/neutron/dhcp/6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0/opts Nov 26 05:00:23 localhost podman[311424]: 2025-11-26 10:00:23.468362545 +0000 UTC m=+0.068484912 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:00:23 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:23.635 2 INFO neutron.agent.securitygroups_rpc [None req-072628c6-7f3c-4ac5-85af-c56311d0802e 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:23 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:23.651 262471 INFO neutron.agent.linux.ip_lib [None req-2effe1ee-4ff5-4d0b-8f35-2926cc1c1ad5 - - - - - -] Device tap31cd2fdc-2e cannot be used as it has no MAC address#033[00m Nov 26 05:00:23 localhost nova_compute[281415]: 2025-11-26 10:00:23.689 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:23 localhost kernel: device tap31cd2fdc-2e entered promiscuous mode Nov 26 05:00:23 localhost NetworkManager[5970]: [1764151223.6983] manager: (tap31cd2fdc-2e): new Generic device (/org/freedesktop/NetworkManager/Devices/30) Nov 26 05:00:23 localhost ovn_controller[153664]: 2025-11-26T10:00:23Z|00168|binding|INFO|Claiming lport 31cd2fdc-2e6f-4921-868b-969dbfc08b2d for this chassis. 
Nov 26 05:00:23 localhost nova_compute[281415]: 2025-11-26 10:00:23.698 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:23 localhost ovn_controller[153664]: 2025-11-26T10:00:23Z|00169|binding|INFO|31cd2fdc-2e6f-4921-868b-969dbfc08b2d: Claiming unknown Nov 26 05:00:23 localhost systemd-udevd[311456]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:00:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:23.713 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=31cd2fdc-2e6f-4921-868b-969dbfc08b2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:23.716 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port 31cd2fdc-2e6f-4921-868b-969dbfc08b2d in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:00:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:23.717 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:23.722 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ea56ff82-2a2d-4033-8fa0-62d478e058eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:23 localhost journal[229445]: ethtool ioctl error on tap31cd2fdc-2e: No such device Nov 26 05:00:23 localhost ovn_controller[153664]: 2025-11-26T10:00:23Z|00170|binding|INFO|Setting lport 31cd2fdc-2e6f-4921-868b-969dbfc08b2d ovn-installed in OVS Nov 26 05:00:23 localhost ovn_controller[153664]: 2025-11-26T10:00:23Z|00171|binding|INFO|Setting lport 31cd2fdc-2e6f-4921-868b-969dbfc08b2d up in Southbound Nov 26 05:00:23 localhost journal[229445]: ethtool ioctl error on tap31cd2fdc-2e: No such device Nov 26 05:00:23 localhost nova_compute[281415]: 2025-11-26 10:00:23.742 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:23 localhost journal[229445]: ethtool ioctl error on tap31cd2fdc-2e: No such device Nov 26 05:00:23 localhost journal[229445]: ethtool ioctl error on tap31cd2fdc-2e: No such device Nov 26 05:00:23 localhost journal[229445]: ethtool ioctl error on tap31cd2fdc-2e: No such device Nov 26 05:00:23 localhost journal[229445]: ethtool ioctl error on tap31cd2fdc-2e: No such device Nov 26 05:00:23 localhost journal[229445]: ethtool ioctl error on tap31cd2fdc-2e: 
No such device Nov 26 05:00:23 localhost journal[229445]: ethtool ioctl error on tap31cd2fdc-2e: No such device Nov 26 05:00:23 localhost nova_compute[281415]: 2025-11-26 10:00:23.787 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:23 localhost nova_compute[281415]: 2025-11-26 10:00:23.818 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:24 localhost systemd[1]: tmp-crun.drMGXo.mount: Deactivated successfully. Nov 26 05:00:24 localhost dnsmasq[310954]: exiting on receipt of SIGTERM Nov 26 05:00:24 localhost podman[311520]: 2025-11-26 10:00:24.432306253 +0000 UTC m=+0.083307168 container kill ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:00:24 localhost systemd[1]: libpod-ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e.scope: Deactivated successfully. 
Nov 26 05:00:24 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:24.475 2 INFO neutron.agent.securitygroups_rpc [None req-1864c532-089e-46ef-8828-dd46578b2770 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:24 localhost podman[311534]: 2025-11-26 10:00:24.506586823 +0000 UTC m=+0.059528256 container died ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:24 localhost podman[311534]: 2025-11-26 10:00:24.601987827 +0000 UTC m=+0.154929220 container cleanup ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:00:24 localhost nova_compute[281415]: 2025-11-26 10:00:24.638 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:24 localhost systemd[1]: libpod-conmon-ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e.scope: 
Deactivated successfully. Nov 26 05:00:24 localhost podman[311541]: 2025-11-26 10:00:24.719278006 +0000 UTC m=+0.253018993 container remove ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:00:24 localhost nova_compute[281415]: 2025-11-26 10:00:24.734 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:24 localhost kernel: device tapc3b952ce-9b left promiscuous mode Nov 26 05:00:24 localhost ovn_controller[153664]: 2025-11-26T10:00:24Z|00172|binding|INFO|Releasing lport c3b952ce-9bb7-4678-bed6-0aa6419f4025 from this chassis (sb_readonly=0) Nov 26 05:00:24 localhost ovn_controller[153664]: 2025-11-26T10:00:24Z|00173|binding|INFO|Setting lport c3b952ce-9bb7-4678-bed6-0aa6419f4025 down in Southbound Nov 26 05:00:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:24.743 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd37b8fb99ac14f7abe4246c12a00693a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5cb9bdc-f862-45b3-88c3-a89b4ea6e23b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c3b952ce-9bb7-4678-bed6-0aa6419f4025) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:24.745 159486 INFO neutron.agent.ovn.metadata.agent [-] Port c3b952ce-9bb7-4678-bed6-0aa6419f4025 in datapath 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0 unbound from our chassis#033[00m Nov 26 05:00:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:24.747 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ae5bbd4-7c80-4c74-a199-dc17d95fe4c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:00:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:24.755 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[9908671e-992b-4f1d-b9ad-d0203af0a969]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:24 localhost nova_compute[281415]: 2025-11-26 10:00:24.758 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:24 localhost podman[311611]: Nov 26 05:00:24 localhost podman[311611]: 2025-11-26 10:00:24.987262269 
+0000 UTC m=+0.093852748 container create 291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:00:25 localhost ovn_controller[153664]: 2025-11-26T10:00:25Z|00174|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:00:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:25.024 262471 INFO neutron.agent.dhcp.agent [None req-634e673d-08b3-4c9c-90f4-6c5550fd277d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:00:25 localhost systemd[1]: Started libpod-conmon-291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4.scope. Nov 26 05:00:25 localhost podman[311611]: 2025-11-26 10:00:24.943296882 +0000 UTC m=+0.049887401 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:00:25 localhost systemd[1]: Started libcrun container. 
Nov 26 05:00:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63b51beabb9e2456eacdbb2399ff6ec0ce91fd21cce3dd862063836d4ba7f63a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:00:25 localhost podman[311611]: 2025-11-26 10:00:25.069503904 +0000 UTC m=+0.176094383 container init 291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 26 05:00:25 localhost podman[311611]: 2025-11-26 10:00:25.079595002 +0000 UTC m=+0.186185481 container start 291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:25 localhost dnsmasq[311629]: started, version 2.85 cachesize 150 Nov 26 05:00:25 localhost dnsmasq[311629]: DNS service limited to local subnets Nov 26 05:00:25 localhost dnsmasq[311629]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:00:25 localhost dnsmasq[311629]: warning: no upstream servers 
configured Nov 26 05:00:25 localhost dnsmasq-dhcp[311629]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:00:25 localhost dnsmasq[311629]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:25 localhost dnsmasq-dhcp[311629]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:25 localhost dnsmasq-dhcp[311629]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:25 localhost nova_compute[281415]: 2025-11-26 10:00:25.090 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:25.108 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:00:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:25.128 262471 INFO neutron.agent.dhcp.agent [None req-2effe1ee-4ff5-4d0b-8f35-2926cc1c1ad5 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:23Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=81c3e8d3-7e16-4566-89d4-904305da9d6a, ip_allocation=immediate, mac_address=fa:16:3e:4c:96:14, name=tempest-NetworksTestDHCPv6-1698904901, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=42907, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['57cbf00c-86b6-45b1-870b-b4d2a3275c2d'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:22Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1078, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:23Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:00:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:25.185 262471 INFO neutron.agent.dhcp.agent [None req-99ef34ab-3ed7-44bd-bff3-25797a4a567c - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:25 localhost dnsmasq[311629]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:25 localhost dnsmasq-dhcp[311629]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:25 localhost podman[311649]: 2025-11-26 10:00:25.34299505 +0000 UTC m=+0.063694609 container kill 291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:00:25 localhost dnsmasq-dhcp[311629]: read 
/var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:25 localhost systemd[1]: var-lib-containers-storage-overlay-40552554b86beceec2b6fc8b5bec668e9ca5f78cde377db62b1810fdfce4aab5-merged.mount: Deactivated successfully. Nov 26 05:00:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea63e11eecbdcfbb856f2552df16c5a86ce22807530cf952bf8afbef4b19837e-userdata-shm.mount: Deactivated successfully. Nov 26 05:00:25 localhost systemd[1]: run-netns-qdhcp\x2d6ae5bbd4\x2d7c80\x2d4c74\x2da199\x2ddc17d95fe4c0.mount: Deactivated successfully. Nov 26 05:00:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:25.638 262471 INFO neutron.agent.dhcp.agent [None req-59e86086-1952-491c-a3f1-82ebb7af6911 - - - - - -] DHCP configuration for ports {'81c3e8d3-7e16-4566-89d4-904305da9d6a'} is completed#033[00m Nov 26 05:00:25 localhost ovn_controller[153664]: 2025-11-26T10:00:25Z|00175|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:00:25 localhost nova_compute[281415]: 2025-11-26 10:00:25.721 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:25.725 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:00:25 localhost dnsmasq[311629]: exiting on receipt of SIGTERM Nov 26 05:00:25 localhost podman[311687]: 2025-11-26 10:00:25.879111191 +0000 UTC m=+0.065212094 container kill 291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:00:25 localhost systemd[1]: libpod-291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4.scope: Deactivated successfully. Nov 26 05:00:26 localhost podman[311699]: 2025-11-26 10:00:26.038749286 +0000 UTC m=+0.145492779 container died 291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:00:26 localhost podman[311699]: 2025-11-26 10:00:26.062708835 +0000 UTC m=+0.169452318 container cleanup 291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:00:26 localhost systemd[1]: libpod-conmon-291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4.scope: Deactivated successfully. 
Nov 26 05:00:26 localhost podman[311703]: 2025-11-26 10:00:26.125076805 +0000 UTC m=+0.218932247 container remove 291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:00:26 localhost ovn_controller[153664]: 2025-11-26T10:00:26Z|00176|binding|INFO|Releasing lport 31cd2fdc-2e6f-4921-868b-969dbfc08b2d from this chassis (sb_readonly=0) Nov 26 05:00:26 localhost kernel: device tap31cd2fdc-2e left promiscuous mode Nov 26 05:00:26 localhost ovn_controller[153664]: 2025-11-26T10:00:26Z|00177|binding|INFO|Setting lport 31cd2fdc-2e6f-4921-868b-969dbfc08b2d down in Southbound Nov 26 05:00:26 localhost nova_compute[281415]: 2025-11-26 10:00:26.143 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:26 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:26.149 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=31cd2fdc-2e6f-4921-868b-969dbfc08b2d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:26 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:26.150 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 31cd2fdc-2e6f-4921-868b-969dbfc08b2d in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:00:26 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:26.151 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:26 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:26.152 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[6758d3f7-4338-441f-a5a5-56b7073cfc34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:26 localhost nova_compute[281415]: 2025-11-26 10:00:26.159 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:26 localhost systemd[1]: 
var-lib-containers-storage-overlay-63b51beabb9e2456eacdbb2399ff6ec0ce91fd21cce3dd862063836d4ba7f63a-merged.mount: Deactivated successfully. Nov 26 05:00:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-291eda10b73414259239399adb1948214ba902523383b5f0e1067601ade833a4-userdata-shm.mount: Deactivated successfully. Nov 26 05:00:26 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.032 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:27 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:27.309 2 INFO neutron.agent.securitygroups_rpc [None req-2348c9e4-c606-40f3-a3ad-3090c00dd0c9 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:27 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:27.355 262471 INFO neutron.agent.linux.ip_lib [None req-18983283-df2a-4299-ad22-85a9946efc88 - - - - - -] Device tap58ebaef9-bd cannot be used as it has no MAC address#033[00m Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.380 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:27 localhost kernel: device tap58ebaef9-bd entered promiscuous mode Nov 26 05:00:27 localhost NetworkManager[5970]: [1764151227.3913] manager: (tap58ebaef9-bd): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Nov 26 05:00:27 localhost ovn_controller[153664]: 2025-11-26T10:00:27Z|00178|binding|INFO|Claiming lport 58ebaef9-bda8-4639-806f-32ac518cdf56 for this chassis. 
Nov 26 05:00:27 localhost ovn_controller[153664]: 2025-11-26T10:00:27Z|00179|binding|INFO|58ebaef9-bda8-4639-806f-32ac518cdf56: Claiming unknown Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.393 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:27 localhost systemd-udevd[311739]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:00:27 localhost ovn_controller[153664]: 2025-11-26T10:00:27Z|00180|binding|INFO|Setting lport 58ebaef9-bda8-4639-806f-32ac518cdf56 up in Southbound Nov 26 05:00:27 localhost ovn_controller[153664]: 2025-11-26T10:00:27Z|00181|binding|INFO|Setting lport 58ebaef9-bda8-4639-806f-32ac518cdf56 ovn-installed in OVS Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.403 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:27.406 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': 
'', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=58ebaef9-bda8-4639-806f-32ac518cdf56) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.407 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:27.412 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 58ebaef9-bda8-4639-806f-32ac518cdf56 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:00:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:27.414 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:27.415 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[939d6da7-1519-4d62-8c05-4e1638063cd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:27 localhost journal[229445]: ethtool ioctl error on tap58ebaef9-bd: No such device Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.419 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:27 localhost journal[229445]: ethtool ioctl error on tap58ebaef9-bd: No such device Nov 26 05:00:27 localhost journal[229445]: ethtool 
ioctl error on tap58ebaef9-bd: No such device Nov 26 05:00:27 localhost journal[229445]: ethtool ioctl error on tap58ebaef9-bd: No such device Nov 26 05:00:27 localhost journal[229445]: ethtool ioctl error on tap58ebaef9-bd: No such device Nov 26 05:00:27 localhost journal[229445]: ethtool ioctl error on tap58ebaef9-bd: No such device Nov 26 05:00:27 localhost journal[229445]: ethtool ioctl error on tap58ebaef9-bd: No such device Nov 26 05:00:27 localhost journal[229445]: ethtool ioctl error on tap58ebaef9-bd: No such device Nov 26 05:00:27 localhost podman[240049]: time="2025-11-26T10:00:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.467 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:27 localhost podman[240049]: @ - - [26/Nov/2025:10:00:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.507 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:27 localhost podman[240049]: @ - - [26/Nov/2025:10:00:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18773 "" "Go-http-client/1.1" Nov 26 05:00:27 localhost nova_compute[281415]: 2025-11-26 10:00:27.547 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:28 localhost podman[311811]: Nov 26 05:00:28 localhost podman[311811]: 2025-11-26 10:00:28.419093369 +0000 UTC m=+0.094865849 container create 0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:00:28 localhost systemd[1]: Started libpod-conmon-0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37.scope. Nov 26 05:00:28 localhost podman[311811]: 2025-11-26 10:00:28.376015799 +0000 UTC m=+0.051788299 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:00:28 localhost systemd[1]: tmp-crun.i1N1Xw.mount: Deactivated successfully. Nov 26 05:00:28 localhost systemd[1]: Started libcrun container. Nov 26 05:00:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5a2576a79c1cf4dfdfc241a9aea6fc85c73911ea296492f572f6a433ec659a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:00:28 localhost podman[311811]: 2025-11-26 10:00:28.526367753 +0000 UTC m=+0.202140223 container init 0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:00:28 localhost podman[311811]: 2025-11-26 10:00:28.536697328 +0000 UTC m=+0.212469808 container start 
0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:00:28 localhost dnsmasq[311830]: started, version 2.85 cachesize 150 Nov 26 05:00:28 localhost dnsmasq[311830]: DNS service limited to local subnets Nov 26 05:00:28 localhost dnsmasq[311830]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:00:28 localhost dnsmasq[311830]: warning: no upstream servers configured Nov 26 05:00:28 localhost dnsmasq[311830]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:28 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:28.612 262471 INFO neutron.agent.dhcp.agent [None req-18983283-df2a-4299-ad22-85a9946efc88 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:26Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7016c438-8ab2-4f49-a239-8eaaf7a7dcf6, ip_allocation=immediate, mac_address=fa:16:3e:7e:b1:57, name=tempest-NetworksTestDHCPv6-1478444377, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, 
ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=4, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['7bfb4836-60f9-4a39-93f1-c47ebbf1ce40'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:25Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1091, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:26Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:00:28 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:28.728 262471 INFO neutron.agent.dhcp.agent [None req-085b7c7b-86c5-45f4-8c1d-88e833cca882 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:28 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:28.791 2 INFO neutron.agent.securitygroups_rpc [None req-629c2367-ed50-4f72-8e95-787351eb5c9d 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:28 localhost dnsmasq[311830]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:28 localhost podman[311850]: 2025-11-26 10:00:28.81176863 +0000 UTC m=+0.059921308 container kill 0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:00:29 localhost nova_compute[281415]: 2025-11-26 10:00:29.013 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:29 localhost kernel: device tap58ebaef9-bd left promiscuous mode Nov 26 05:00:29 localhost ovn_controller[153664]: 2025-11-26T10:00:29Z|00182|binding|INFO|Releasing lport 58ebaef9-bda8-4639-806f-32ac518cdf56 from this chassis (sb_readonly=0) Nov 26 05:00:29 localhost ovn_controller[153664]: 2025-11-26T10:00:29Z|00183|binding|INFO|Setting lport 58ebaef9-bda8-4639-806f-32ac518cdf56 down in Southbound Nov 26 05:00:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:29.024 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=58ebaef9-bda8-4639-806f-32ac518cdf56) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:29.026 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 58ebaef9-bda8-4639-806f-32ac518cdf56 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:00:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:29.027 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:29.028 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7bb5e1-021e-4241-b789-d76d9a771f0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:29 localhost nova_compute[281415]: 2025-11-26 10:00:29.041 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:29 localhost nova_compute[281415]: 2025-11-26 10:00:29.044 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.054 262471 INFO neutron.agent.dhcp.agent [None 
req-c9298e89-15dd-4c99-ae5b-fb6ef566c88d - - - - - -] DHCP configuration for ports {'7016c438-8ab2-4f49-a239-8eaaf7a7dcf6'} is completed#033[00m Nov 26 05:00:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:29 localhost dnsmasq[311830]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:29 localhost podman[311890]: 2025-11-26 10:00:29.264000897 +0000 UTC m=+0.064672378 container kill 0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent [None req-18983283-df2a-4299-ad22-85a9946efc88 - - - - - -] Unable to reload_allocations dhcp for cc3dc995-51cd-4d70-be2c-11c47524552d.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap58ebaef9-bd not found in namespace qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d. 
Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 26 05:00:29 
localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent return fut.result() Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent raise self._exception Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap58ebaef9-bd not found in namespace qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d. Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.293 262471 ERROR neutron.agent.dhcp.agent #033[00m Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.302 262471 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.447 262471 INFO neutron.agent.dhcp.agent [None req-691a5d6e-e46c-4c84-8d49-e89d3d4a0607 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.448 262471 INFO neutron.agent.dhcp.agent [-] Starting network cc3dc995-51cd-4d70-be2c-11c47524552d dhcp configuration#033[00m Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.448 262471 INFO neutron.agent.dhcp.agent [-] Finished network cc3dc995-51cd-4d70-be2c-11c47524552d dhcp configuration#033[00m Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.449 262471 INFO neutron.agent.dhcp.agent [None req-691a5d6e-e46c-4c84-8d49-e89d3d4a0607 - - - - - -] Synchronizing state complete#033[00m Nov 26 05:00:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:29.526 262471 INFO neutron.agent.dhcp.agent [None req-8d681942-7bfa-45b0-bf9c-442f34465cda - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:29 localhost dnsmasq[311830]: exiting on receipt of SIGTERM Nov 26 05:00:29 localhost systemd[1]: libpod-0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37.scope: Deactivated successfully. 
Nov 26 05:00:29 localhost podman[311921]: 2025-11-26 10:00:29.741648423 +0000 UTC m=+0.070080937 container kill 0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:00:29 localhost podman[311934]: 2025-11-26 10:00:29.818395057 +0000 UTC m=+0.059779494 container died 0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:00:29 localhost podman[311934]: 2025-11-26 10:00:29.85782146 +0000 UTC m=+0.099205857 container cleanup 0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 
05:00:29 localhost systemd[1]: libpod-conmon-0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37.scope: Deactivated successfully. Nov 26 05:00:29 localhost podman[311936]: 2025-11-26 10:00:29.899573371 +0000 UTC m=+0.133813647 container remove 0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:00:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:00:30 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:30.274 2 INFO neutron.agent.securitygroups_rpc [None req-95ce95ba-1231-43ed-bcb9-db9a4b5f0d32 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:30 localhost podman[311964]: 2025-11-26 10:00:30.33865488 +0000 UTC m=+0.087767889 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:00:30 localhost podman[311964]: 2025-11-26 10:00:30.352326933 +0000 UTC m=+0.101439962 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 
'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 05:00:30 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:00:30 localhost podman[311965]: 2025-11-26 10:00:30.41120997 +0000 UTC m=+0.155814596 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 26 05:00:30 localhost systemd[1]: var-lib-containers-storage-overlay-c5a2576a79c1cf4dfdfc241a9aea6fc85c73911ea296492f572f6a433ec659a0-merged.mount: Deactivated successfully. Nov 26 05:00:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a4fb7119d733a69e1d5e4c35c86b4ccce715865a967a3eadbd970ecb0116e37-userdata-shm.mount: Deactivated successfully. Nov 26 05:00:30 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. Nov 26 05:00:30 localhost podman[311965]: 2025-11-26 10:00:30.45460989 +0000 UTC m=+0.199214506 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 05:00:30 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:00:30 localhost nova_compute[281415]: 2025-11-26 10:00:30.763 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:31 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:31.001 2 INFO neutron.agent.securitygroups_rpc [None req-e53bc910-54f4-4d69-abfa-23f80dc43217 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:32 localhost nova_compute[281415]: 2025-11-26 10:00:32.072 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:32.124 262471 INFO neutron.agent.linux.ip_lib [None req-14ad631a-a894-4cad-a06d-df0eabbe8a79 - - - - - -] Device tapc7f987a5-c8 cannot be used as it has no MAC address#033[00m Nov 26 05:00:32 
localhost nova_compute[281415]: 2025-11-26 10:00:32.152 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:32 localhost kernel: device tapc7f987a5-c8 entered promiscuous mode Nov 26 05:00:32 localhost NetworkManager[5970]: [1764151232.1605] manager: (tapc7f987a5-c8): new Generic device (/org/freedesktop/NetworkManager/Devices/32) Nov 26 05:00:32 localhost nova_compute[281415]: 2025-11-26 10:00:32.160 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:32 localhost ovn_controller[153664]: 2025-11-26T10:00:32Z|00184|binding|INFO|Claiming lport c7f987a5-c820-4d79-ac17-63f98824d9fc for this chassis. Nov 26 05:00:32 localhost ovn_controller[153664]: 2025-11-26T10:00:32Z|00185|binding|INFO|c7f987a5-c820-4d79-ac17-63f98824d9fc: Claiming unknown Nov 26 05:00:32 localhost systemd-udevd[312016]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:00:32 localhost ovn_controller[153664]: 2025-11-26T10:00:32Z|00186|binding|INFO|Setting lport c7f987a5-c820-4d79-ac17-63f98824d9fc ovn-installed in OVS Nov 26 05:00:32 localhost nova_compute[281415]: 2025-11-26 10:00:32.175 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:32 localhost journal[229445]: ethtool ioctl error on tapc7f987a5-c8: No such device Nov 26 05:00:32 localhost nova_compute[281415]: 2025-11-26 10:00:32.184 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:32 localhost journal[229445]: ethtool ioctl error on tapc7f987a5-c8: No such device Nov 26 05:00:32 localhost nova_compute[281415]: 2025-11-26 10:00:32.189 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:32 localhost journal[229445]: ethtool ioctl error on tapc7f987a5-c8: No such device Nov 26 05:00:32 localhost journal[229445]: ethtool ioctl error on tapc7f987a5-c8: No such device Nov 26 05:00:32 localhost journal[229445]: ethtool ioctl error on tapc7f987a5-c8: No such device Nov 26 05:00:32 localhost journal[229445]: ethtool ioctl error on tapc7f987a5-c8: No such device Nov 26 05:00:32 localhost journal[229445]: ethtool ioctl error on tapc7f987a5-c8: No such device Nov 26 05:00:32 localhost journal[229445]: ethtool ioctl error on tapc7f987a5-c8: No such device Nov 26 05:00:32 localhost nova_compute[281415]: 2025-11-26 10:00:32.235 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:32 localhost nova_compute[281415]: 2025-11-26 10:00:32.263 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 
05:00:32 localhost ovn_controller[153664]: 2025-11-26T10:00:32Z|00187|binding|INFO|Setting lport c7f987a5-c820-4d79-ac17-63f98824d9fc up in Southbound Nov 26 05:00:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:32.669 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c7f987a5-c820-4d79-ac17-63f98824d9fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:32.671 159486 INFO neutron.agent.ovn.metadata.agent [-] Port c7f987a5-c820-4d79-ac17-63f98824d9fc in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:00:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:32.673 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:32.675 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[95b5d6f3-0f27-4847-bbe0-f8e09425c611]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:33 localhost podman[312088]: Nov 26 05:00:33 localhost podman[312088]: 2025-11-26 10:00:33.176790891 +0000 UTC m=+0.093333743 container create 87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 26 05:00:33 localhost systemd[1]: Started libpod-conmon-87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70.scope. Nov 26 05:00:33 localhost podman[312088]: 2025-11-26 10:00:33.129987812 +0000 UTC m=+0.046530694 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:00:33 localhost systemd[1]: tmp-crun.rL4Mcs.mount: Deactivated successfully. Nov 26 05:00:33 localhost systemd[1]: Started libcrun container. 
Nov 26 05:00:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8789417913b15cf14785583410d038eb9c7aaf5f38adde838ab254a3db8653a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:00:33 localhost podman[312088]: 2025-11-26 10:00:33.267152617 +0000 UTC m=+0.183695469 container init 87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:00:33 localhost podman[312088]: 2025-11-26 10:00:33.318720197 +0000 UTC m=+0.235263049 container start 87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:00:33 localhost dnsmasq[312106]: started, version 2.85 cachesize 150 Nov 26 05:00:33 localhost dnsmasq[312106]: DNS service limited to local subnets Nov 26 05:00:33 localhost dnsmasq[312106]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:00:33 localhost dnsmasq[312106]: warning: no upstream servers 
configured Nov 26 05:00:33 localhost dnsmasq-dhcp[312106]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:00:33 localhost dnsmasq[312106]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:33 localhost dnsmasq-dhcp[312106]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:33 localhost dnsmasq-dhcp[312106]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:33 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:33.533 262471 INFO neutron.agent.dhcp.agent [None req-4fa2b379-176b-41e4-b185-172704624db0 - - - - - -] DHCP configuration for ports {'b9e19b90-0127-4e8e-9cec-83ef7c3f00d1', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:33 localhost dnsmasq[312106]: exiting on receipt of SIGTERM Nov 26 05:00:33 localhost podman[312125]: 2025-11-26 10:00:33.68701754 +0000 UTC m=+0.058746284 container kill 87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:00:33 localhost systemd[1]: libpod-87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70.scope: Deactivated successfully. 
Nov 26 05:00:33 localhost podman[312138]: 2025-11-26 10:00:33.767843723 +0000 UTC m=+0.063286917 container died 87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:33 localhost podman[312138]: 2025-11-26 10:00:33.799155497 +0000 UTC m=+0.094598631 container cleanup 87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 05:00:33 localhost systemd[1]: libpod-conmon-87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70.scope: Deactivated successfully. 
Nov 26 05:00:33 localhost podman[312140]: 2025-11-26 10:00:33.852612133 +0000 UTC m=+0.140234706 container remove 87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:00:33 localhost ovn_controller[153664]: 2025-11-26T10:00:33Z|00188|binding|INFO|Releasing lport c7f987a5-c820-4d79-ac17-63f98824d9fc from this chassis (sb_readonly=0) Nov 26 05:00:33 localhost nova_compute[281415]: 2025-11-26 10:00:33.866 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:33 localhost ovn_controller[153664]: 2025-11-26T10:00:33Z|00189|binding|INFO|Setting lport c7f987a5-c820-4d79-ac17-63f98824d9fc down in Southbound Nov 26 05:00:33 localhost kernel: device tapc7f987a5-c8 left promiscuous mode Nov 26 05:00:33 localhost nova_compute[281415]: 2025-11-26 10:00:33.893 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:33 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:33.894 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c7f987a5-c820-4d79-ac17-63f98824d9fc) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:33 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:33.897 159486 INFO neutron.agent.ovn.metadata.agent [-] Port c7f987a5-c820-4d79-ac17-63f98824d9fc in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:00:33 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:33.899 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:33 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:33.900 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a1f58f-1cf6-4f18-a787-6adb75880382]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 
348127232 kv_alloc: 322961408 Nov 26 05:00:34 localhost systemd[1]: var-lib-containers-storage-overlay-8789417913b15cf14785583410d038eb9c7aaf5f38adde838ab254a3db8653a8-merged.mount: Deactivated successfully. Nov 26 05:00:34 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87d0563eb3993ce88324878821b4a587261a066ebb0923573dd77b2303ef9d70-userdata-shm.mount: Deactivated successfully. Nov 26 05:00:34 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:34.701 262471 INFO neutron.agent.dhcp.agent [None req-cccc08ab-f716-4ec2-a3b0-4d513a05193c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:00:34 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:34.701 262471 INFO neutron.agent.dhcp.agent [None req-cccc08ab-f716-4ec2-a3b0-4d513a05193c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:00:34 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:34.702 262471 INFO neutron.agent.dhcp.agent [None req-cccc08ab-f716-4ec2-a3b0-4d513a05193c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:00:34 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. 
Nov 26 05:00:35 localhost ovn_controller[153664]: 2025-11-26T10:00:35Z|00190|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:00:35 localhost nova_compute[281415]: 2025-11-26 10:00:35.447 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:35 localhost nova_compute[281415]: 2025-11-26 10:00:35.765 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:35 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:35.868 262471 INFO neutron.agent.linux.ip_lib [None req-4b97254c-7034-4db8-b11e-844b5008bc17 - - - - - -] Device tap2046126f-08 cannot be used as it has no MAC address#033[00m Nov 26 05:00:35 localhost nova_compute[281415]: 2025-11-26 10:00:35.902 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:35 localhost kernel: device tap2046126f-08 entered promiscuous mode Nov 26 05:00:35 localhost ovn_controller[153664]: 2025-11-26T10:00:35Z|00191|binding|INFO|Claiming lport 2046126f-0872-450a-9358-71925605aaf4 for this chassis. Nov 26 05:00:35 localhost ovn_controller[153664]: 2025-11-26T10:00:35Z|00192|binding|INFO|2046126f-0872-450a-9358-71925605aaf4: Claiming unknown Nov 26 05:00:35 localhost nova_compute[281415]: 2025-11-26 10:00:35.913 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:35 localhost NetworkManager[5970]: [1764151235.9142] manager: (tap2046126f-08): new Generic device (/org/freedesktop/NetworkManager/Devices/33) Nov 26 05:00:35 localhost systemd-udevd[312177]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:00:35 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:35.923 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2046126f-0872-450a-9358-71925605aaf4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:35 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:35.926 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 2046126f-0872-450a-9358-71925605aaf4 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:00:35 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:35.927 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:35 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:35.928 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d951e406-ba42-48ec-81e1-54269b3e1808]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:35 localhost journal[229445]: ethtool ioctl error on tap2046126f-08: No such device Nov 26 05:00:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:00:35 localhost journal[229445]: ethtool ioctl error on tap2046126f-08: No such device Nov 26 05:00:35 localhost nova_compute[281415]: 2025-11-26 10:00:35.951 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:35 localhost ovn_controller[153664]: 2025-11-26T10:00:35Z|00193|binding|INFO|Setting lport 2046126f-0872-450a-9358-71925605aaf4 ovn-installed in OVS Nov 26 05:00:35 localhost ovn_controller[153664]: 2025-11-26T10:00:35Z|00194|binding|INFO|Setting lport 2046126f-0872-450a-9358-71925605aaf4 up in Southbound Nov 26 05:00:35 localhost journal[229445]: ethtool ioctl error on tap2046126f-08: No such device Nov 26 05:00:35 localhost journal[229445]: ethtool ioctl error on tap2046126f-08: No such device Nov 26 05:00:35 localhost journal[229445]: ethtool ioctl error on tap2046126f-08: No such device Nov 26 05:00:35 localhost journal[229445]: ethtool ioctl error on tap2046126f-08: No such device Nov 26 05:00:35 localhost journal[229445]: ethtool ioctl error on tap2046126f-08: No such device Nov 26 05:00:35 localhost journal[229445]: ethtool ioctl error on tap2046126f-08: No such device Nov 26 05:00:35 localhost nova_compute[281415]: 2025-11-26 10:00:35.997 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:36 localhost nova_compute[281415]: 2025-11-26 10:00:36.028 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:00:36 localhost podman[312184]: 2025-11-26 10:00:36.063437244 +0000 UTC m=+0.096787006 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118) Nov 26 05:00:36 localhost podman[312184]: 2025-11-26 10:00:36.137390245 +0000 UTC m=+0.170740027 
container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:00:36 localhost podman[312223]: 2025-11-26 10:00:36.14704801 +0000 UTC m=+0.074989083 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base 
Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 26 05:00:36 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:00:36 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:36.154 2 INFO neutron.agent.securitygroups_rpc [None req-8652b4a0-b233-4639-9f8e-dc86a803b151 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:36 localhost podman[312223]: 2025-11-26 10:00:36.170264274 +0000 UTC m=+0.098205357 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 26 05:00:36 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:00:36 localhost podman[312293]: Nov 26 05:00:36 localhost podman[312293]: 2025-11-26 10:00:36.896838142 +0000 UTC m=+0.093771256 container create b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 26 05:00:36 localhost systemd[1]: Started libpod-conmon-b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088.scope. Nov 26 05:00:36 localhost podman[312293]: 2025-11-26 10:00:36.854271117 +0000 UTC m=+0.051204261 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:00:36 localhost systemd[1]: Started libcrun container. 
Nov 26 05:00:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/881f4836fd65125b183ffc7d4a5a914b650c6956e961bf6a22b56ce59747acba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:00:36 localhost podman[312293]: 2025-11-26 10:00:36.982386625 +0000 UTC m=+0.179319729 container init b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:00:36 localhost podman[312293]: 2025-11-26 10:00:36.995645877 +0000 UTC m=+0.192578981 container start b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:00:37 localhost dnsmasq[312312]: started, version 2.85 cachesize 150 Nov 26 05:00:37 localhost dnsmasq[312312]: DNS service limited to local subnets Nov 26 05:00:37 localhost dnsmasq[312312]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:00:37 localhost dnsmasq[312312]: warning: no upstream servers 
configured Nov 26 05:00:37 localhost dnsmasq-dhcp[312312]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:00:37 localhost dnsmasq[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:37 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:37 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:37 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:37.082 262471 INFO neutron.agent.dhcp.agent [None req-4b97254c-7034-4db8-b11e-844b5008bc17 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:35Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=683d0c42-c4d3-4940-9766-3261f7de113f, ip_allocation=immediate, mac_address=fa:16:3e:3f:4a:73, name=tempest-NetworksTestDHCPv6-937355610, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['9f41497c-95b4-45ae-bd11-28f56ed06942'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:34Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, 
project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1130, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:35Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:00:37 localhost nova_compute[281415]: 2025-11-26 10:00:37.100 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:37 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:37.217 262471 INFO neutron.agent.dhcp.agent [None req-eb504f6a-2662-4341-b26a-5d06f2cb9925 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:37 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:37.267 2 INFO neutron.agent.securitygroups_rpc [None req-aa8c48ff-def9-4bcd-8b23-764ccea5b94e 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:37 localhost podman[312331]: 2025-11-26 10:00:37.30703586 +0000 UTC m=+0.066219314 container kill b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:00:37 localhost dnsmasq[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:37 localhost 
dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:37 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:37 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:37.546 262471 INFO neutron.agent.dhcp.agent [None req-a9c133f7-711e-4b90-b382-1be4564277d8 - - - - - -] DHCP configuration for ports {'683d0c42-c4d3-4940-9766-3261f7de113f'} is completed#033[00m Nov 26 05:00:37 localhost dnsmasq[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:37 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:37 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:37 localhost podman[312371]: 2025-11-26 10:00:37.702096051 +0000 UTC m=+0.069827561 container kill b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:00:37 localhost systemd[1]: tmp-crun.aJ61HY.mount: Deactivated successfully. Nov 26 05:00:38 localhost systemd[1]: tmp-crun.QSjaIx.mount: Deactivated successfully. 
Nov 26 05:00:38 localhost dnsmasq[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:38 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:38 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:38 localhost podman[312408]: 2025-11-26 10:00:38.218955144 +0000 UTC m=+0.076683273 container kill b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:00:38 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:38.565 2 INFO neutron.agent.securitygroups_rpc [None req-15311245-75cf-497b-af9b-564a2e5046af 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:38 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:38.597 262471 INFO neutron.agent.dhcp.agent [None req-ecb82d04-325d-4ac5-89d8-3612515edb1b - - - - - -] DHCP configuration for ports {'2046126f-0872-450a-9358-71925605aaf4', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:38 localhost dnsmasq[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:38 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:38 localhost podman[312447]: 2025-11-26 10:00:38.746486202 +0000 UTC 
m=+0.073934792 container kill b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:00:38 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:38 localhost systemd[1]: tmp-crun.ldoVIS.mount: Deactivated successfully. Nov 26 05:00:38 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:38.937 262471 INFO neutron.agent.dhcp.agent [None req-cffe847e-0510-406c-a925-fb3259d4ce37 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:38Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ec9707ec-3d19-43b9-99e9-bfb36fd87671, ip_allocation=immediate, mac_address=fa:16:3e:b7:66:ad, name=tempest-NetworksTestDHCPv6-1093509210, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=10, router:external=False, 
shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['347210d9-affd-437d-b5b4-afe0c632fdda'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:37Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1161, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:38Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:00:39 localhost dnsmasq[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:39 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:39 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:39 localhost podman[312487]: 2025-11-26 10:00:39.135053371 +0000 UTC m=+0.065628936 container kill b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:00:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:39.493 262471 INFO neutron.agent.dhcp.agent [None 
req-a29646bc-cd6c-4c1a-aad5-448c68969ac2 - - - - - -] DHCP configuration for ports {'2046126f-0872-450a-9358-71925605aaf4', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:39 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:39.735 2 INFO neutron.agent.securitygroups_rpc [None req-ac27dd80-c0f9-474c-a96d-9059cfbe6714 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:39.768 262471 INFO neutron.agent.dhcp.agent [None req-b7b505bc-d4b2-4f5f-915f-60e1b5cf45f1 - - - - - -] DHCP configuration for ports {'ec9707ec-3d19-43b9-99e9-bfb36fd87671'} is completed#033[00m Nov 26 05:00:40 localhost dnsmasq[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:40 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:40 localhost podman[312524]: 2025-11-26 10:00:40.165595664 +0000 UTC m=+0.070731897 container kill b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:00:40 localhost dnsmasq-dhcp[312312]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 05:00:40 localhost systemd[1]: tmp-crun.1ZXSbK.mount: Deactivated successfully. Nov 26 05:00:40 localhost podman[312538]: 2025-11-26 10:00:40.281456511 +0000 UTC m=+0.096025363 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:00:40 localhost podman[312538]: 2025-11-26 10:00:40.293105775 +0000 UTC m=+0.107674627 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:00:40 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 05:00:40 localhost nova_compute[281415]: 2025-11-26 10:00:40.790 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:41 localhost systemd[1]: tmp-crun.427NNM.mount: Deactivated successfully. 
Nov 26 05:00:41 localhost dnsmasq[312312]: exiting on receipt of SIGTERM Nov 26 05:00:41 localhost podman[312584]: 2025-11-26 10:00:41.791017749 +0000 UTC m=+0.081205856 container kill b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:00:41 localhost systemd[1]: libpod-b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088.scope: Deactivated successfully. Nov 26 05:00:41 localhost podman[312605]: 2025-11-26 10:00:41.866436723 +0000 UTC m=+0.046153691 container died b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:00:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088-userdata-shm.mount: Deactivated successfully. 
Nov 26 05:00:41 localhost podman[312605]: 2025-11-26 10:00:41.902546318 +0000 UTC m=+0.082263266 container remove b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:41 localhost systemd[1]: libpod-conmon-b917c887e25449b63ddfa743398d0c5c128b8f964f2f579e106e825e44cac088.scope: Deactivated successfully. Nov 26 05:00:41 localhost ovn_controller[153664]: 2025-11-26T10:00:41Z|00195|binding|INFO|Releasing lport 2046126f-0872-450a-9358-71925605aaf4 from this chassis (sb_readonly=0) Nov 26 05:00:41 localhost kernel: device tap2046126f-08 left promiscuous mode Nov 26 05:00:41 localhost ovn_controller[153664]: 2025-11-26T10:00:41Z|00196|binding|INFO|Setting lport 2046126f-0872-450a-9358-71925605aaf4 down in Southbound Nov 26 05:00:41 localhost nova_compute[281415]: 2025-11-26 10:00:41.951 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:41.961 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 
'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2046126f-0872-450a-9358-71925605aaf4) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:41.963 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 2046126f-0872-450a-9358-71925605aaf4 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:00:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:41.964 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:41.965 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[03377ded-5115-45b3-9303-697e6231e310]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:41 localhost nova_compute[281415]: 2025-11-26 10:00:41.972 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:42 localhost nova_compute[281415]: 2025-11-26 10:00:42.103 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:42 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:42.227 2 INFO neutron.agent.securitygroups_rpc [None req-f6c0b21b-8cde-45f3-88d5-80eedd7257c7 a288769cbd524b60bf567f14d8972f45 eeec26dadb514e97b8b9dea346739ecd - - default default] Security group member updated ['b769a949-37a6-4294-867f-262d5bc33e35']#033[00m Nov 26 05:00:42 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:42.465 2 INFO neutron.agent.securitygroups_rpc [None req-1660a980-1b13-430b-8b2c-f1e4906cfc7b 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:42 localhost systemd[1]: var-lib-containers-storage-overlay-881f4836fd65125b183ffc7d4a5a914b650c6956e961bf6a22b56ce59747acba-merged.mount: Deactivated successfully. Nov 26 05:00:42 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. 
Nov 26 05:00:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:43.037 262471 INFO neutron.agent.linux.ip_lib [None req-51ae501d-f878-43d5-b916-2b621b18a06a - - - - - -] Device tap3c3b338b-cd cannot be used as it has no MAC address#033[00m Nov 26 05:00:43 localhost nova_compute[281415]: 2025-11-26 10:00:43.095 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:43 localhost kernel: device tap3c3b338b-cd entered promiscuous mode Nov 26 05:00:43 localhost NetworkManager[5970]: [1764151243.1085] manager: (tap3c3b338b-cd): new Generic device (/org/freedesktop/NetworkManager/Devices/34) Nov 26 05:00:43 localhost systemd-udevd[312634]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:00:43 localhost ovn_controller[153664]: 2025-11-26T10:00:43Z|00197|binding|INFO|Claiming lport 3c3b338b-cdb6-4947-b1b2-a42d094ec69d for this chassis. Nov 26 05:00:43 localhost nova_compute[281415]: 2025-11-26 10:00:43.111 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:43 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:43.111 2 INFO neutron.agent.securitygroups_rpc [None req-98bf2e7b-26b0-4cfa-8fef-ebb707a47196 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:43 localhost ovn_controller[153664]: 2025-11-26T10:00:43Z|00198|binding|INFO|3c3b338b-cdb6-4947-b1b2-a42d094ec69d: Claiming unknown Nov 26 05:00:43 localhost ovn_controller[153664]: 2025-11-26T10:00:43Z|00199|binding|INFO|Setting lport 3c3b338b-cdb6-4947-b1b2-a42d094ec69d ovn-installed in OVS Nov 26 05:00:43 localhost nova_compute[281415]: 2025-11-26 10:00:43.122 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:43 localhost ovn_controller[153664]: 2025-11-26T10:00:43Z|00200|binding|INFO|Setting lport 3c3b338b-cdb6-4947-b1b2-a42d094ec69d up in Southbound Nov 26 05:00:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:43.126 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3c3b338b-cdb6-4947-b1b2-a42d094ec69d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:43 localhost nova_compute[281415]: 2025-11-26 10:00:43.125 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:43.128 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 
3c3b338b-cdb6-4947-b1b2-a42d094ec69d in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:00:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:43.129 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:43.131 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4efe958c-b046-49da-9009-fdbddb12d447]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:43 localhost journal[229445]: ethtool ioctl error on tap3c3b338b-cd: No such device Nov 26 05:00:43 localhost nova_compute[281415]: 2025-11-26 10:00:43.141 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:43 localhost journal[229445]: ethtool ioctl error on tap3c3b338b-cd: No such device Nov 26 05:00:43 localhost journal[229445]: ethtool ioctl error on tap3c3b338b-cd: No such device Nov 26 05:00:43 localhost journal[229445]: ethtool ioctl error on tap3c3b338b-cd: No such device Nov 26 05:00:43 localhost journal[229445]: ethtool ioctl error on tap3c3b338b-cd: No such device Nov 26 05:00:43 localhost journal[229445]: ethtool ioctl error on tap3c3b338b-cd: No such device Nov 26 05:00:43 localhost journal[229445]: ethtool ioctl error on tap3c3b338b-cd: No such device Nov 26 05:00:43 localhost journal[229445]: ethtool ioctl error on tap3c3b338b-cd: No such device Nov 26 05:00:43 localhost nova_compute[281415]: 2025-11-26 10:00:43.194 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:43 localhost 
nova_compute[281415]: 2025-11-26 10:00:43.226 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:44 localhost podman[312706]: Nov 26 05:00:44 localhost podman[312706]: 2025-11-26 10:00:44.141043675 +0000 UTC m=+0.096758084 container create 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:00:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:44 localhost systemd[1]: Started libpod-conmon-4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7.scope. Nov 26 05:00:44 localhost podman[312706]: 2025-11-26 10:00:44.094802722 +0000 UTC m=+0.050517181 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:00:44 localhost systemd[1]: Started libcrun container. 
Nov 26 05:00:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/772f7a13db17459d130cb30e6659cd449d58d27e4ea80a3f236393d1bf746a5d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:00:44 localhost podman[312706]: 2025-11-26 10:00:44.23108328 +0000 UTC m=+0.186797699 container init 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:44 localhost podman[312706]: 2025-11-26 10:00:44.241324683 +0000 UTC m=+0.197039092 container start 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 26 05:00:44 localhost dnsmasq[312724]: started, version 2.85 cachesize 150 Nov 26 05:00:44 localhost dnsmasq[312724]: DNS service limited to local subnets Nov 26 05:00:44 localhost dnsmasq[312724]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:00:44 localhost dnsmasq[312724]: warning: no upstream servers 
configured Nov 26 05:00:44 localhost dnsmasq-dhcp[312724]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:00:44 localhost dnsmasq[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:44 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:44 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:44 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:44.450 262471 INFO neutron.agent.dhcp.agent [None req-6c0d4622-3962-483c-8aa6-f87b43c3ea6f - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:44 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:44.591 2 INFO neutron.agent.securitygroups_rpc [None req-4630a2d4-6f14-41ff-a91c-31263cca4375 d4acbeebb8b34223b4a73b397ac29666 05d917b602bb4665ad2ef0dceefd4842 - - default default] Security group member updated ['90f5e819-a82e-4098-b570-22d3f7905e1f']#033[00m Nov 26 05:00:44 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:44.654 2 INFO neutron.agent.securitygroups_rpc [None req-a7f294a6-d6ff-4906-ac47-b9ce1a9fba28 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:44 localhost dnsmasq[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:44 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:44 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:44 localhost podman[312741]: 2025-11-26 10:00:44.660807415 +0000 UTC m=+0.070709637 container kill 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:44 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:44.836 262471 INFO neutron.agent.dhcp.agent [None req-fa7f5dce-e6bf-436f-8a88-7003c9ff7f3f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:42Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=87fa4473-0228-407e-9469-395808b80499, ip_allocation=immediate, mac_address=fa:16:3e:16:85:53, name=tempest-NetworksTestDHCPv6-701060286, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['bd9ef546-aead-4c94-8203-7d8a655b5592'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:41Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, 
project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1170, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:42Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:00:44 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:44.976 262471 INFO neutron.agent.dhcp.agent [None req-70b53792-c53b-40f7-b40c-b1400ab4ce3e - - - - - -] DHCP configuration for ports {'3c3b338b-cdb6-4947-b1b2-a42d094ec69d', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:45 localhost dnsmasq[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:45 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:45 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:45 localhost podman[312779]: 2025-11-26 10:00:45.033010932 +0000 UTC m=+0.065259367 container kill 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:00:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:00:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 05:00:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:45.224 262471 INFO neutron.agent.dhcp.agent [None req-fa7f5dce-e6bf-436f-8a88-7003c9ff7f3f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:44Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c4be5d61-6cf3-496b-8e3f-7c51febac553, ip_allocation=immediate, mac_address=fa:16:3e:48:82:b1, name=tempest-NetworksTestDHCPv6-608224422, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=14, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['945c201b-4ffd-4c80-bea7-b74eb5ac3f39'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:43Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1176, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:44Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:00:45 localhost systemd[1]: tmp-crun.03ie9d.mount: Deactivated successfully. 
Nov 26 05:00:45 localhost podman[312798]: 2025-11-26 10:00:45.336906554 +0000 UTC m=+0.159002931 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118) Nov 26 05:00:45 localhost podman[312800]: 2025-11-26 10:00:45.308183267 +0000 UTC 
m=+0.128088569 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:00:45 localhost podman[312798]: 2025-11-26 10:00:45.372563755 +0000 UTC m=+0.194660112 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:00:45 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:00:45 localhost podman[312800]: 2025-11-26 10:00:45.393374049 +0000 UTC m=+0.213279321 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 26 05:00:45 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:00:45 localhost dnsmasq[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 2 addresses Nov 26 05:00:45 localhost podman[312853]: 2025-11-26 10:00:45.451647748 +0000 UTC m=+0.061256228 container kill 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:00:45 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:45 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:45.454 262471 INFO neutron.agent.dhcp.agent [None req-ed4f016d-9aa0-48e8-8e64-b833c3e5c913 - - - - - -] DHCP configuration for ports {'87fa4473-0228-407e-9469-395808b80499'} is completed#033[00m Nov 26 05:00:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:45.724 262471 INFO neutron.agent.dhcp.agent [None req-93d0386c-c100-4559-bd94-a60f7769d678 - - - - - -] DHCP configuration for ports {'c4be5d61-6cf3-496b-8e3f-7c51febac553'} is completed#033[00m Nov 26 05:00:45 localhost openstack_network_exporter[242153]: ERROR 10:00:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:00:45 localhost openstack_network_exporter[242153]: ERROR 10:00:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:00:45 localhost openstack_network_exporter[242153]: ERROR 10:00:45 
appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:00:45 localhost openstack_network_exporter[242153]: ERROR 10:00:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:00:45 localhost openstack_network_exporter[242153]: Nov 26 05:00:45 localhost nova_compute[281415]: 2025-11-26 10:00:45.821 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:45 localhost openstack_network_exporter[242153]: ERROR 10:00:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:00:45 localhost openstack_network_exporter[242153]: Nov 26 05:00:45 localhost dnsmasq[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:45 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:45 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:45 localhost podman[312893]: 2025-11-26 10:00:45.916668392 +0000 UTC m=+0.063867315 container kill 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:00:46 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:46.008 2 INFO neutron.agent.securitygroups_rpc [None req-796e2b59-94cf-4109-a5dc-0f48e6512a49 02dda53213c14fc6b2416359dddab4ae 
fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:00:46 localhost systemd[1]: tmp-crun.AFVfy0.mount: Deactivated successfully. Nov 26 05:00:46 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:46.166 2 INFO neutron.agent.securitygroups_rpc [None req-6ced4a6a-a058-418f-b6e5-03e43543ca3b a288769cbd524b60bf567f14d8972f45 eeec26dadb514e97b8b9dea346739ecd - - default default] Security group member updated ['b769a949-37a6-4294-867f-262d5bc33e35']#033[00m Nov 26 05:00:46 localhost systemd[1]: tmp-crun.ozQ5DP.mount: Deactivated successfully. Nov 26 05:00:46 localhost dnsmasq[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:46 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:46 localhost dnsmasq-dhcp[312724]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:46 localhost podman[312932]: 2025-11-26 10:00:46.304433998 +0000 UTC m=+0.074292182 container kill 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:00:46 localhost dnsmasq[312724]: exiting on receipt of SIGTERM Nov 26 05:00:46 localhost podman[312971]: 2025-11-26 10:00:46.772191373 +0000 UTC m=+0.071618073 container kill 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:00:46 localhost systemd[1]: libpod-4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7.scope: Deactivated successfully. Nov 26 05:00:46 localhost podman[312985]: 2025-11-26 10:00:46.849806881 +0000 UTC m=+0.060131624 container died 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:00:46 localhost podman[312985]: 2025-11-26 10:00:46.8887368 +0000 UTC m=+0.099061503 container cleanup 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 05:00:46 localhost systemd[1]: 
libpod-conmon-4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7.scope: Deactivated successfully. Nov 26 05:00:46 localhost podman[312987]: 2025-11-26 10:00:46.935949422 +0000 UTC m=+0.139707631 container remove 4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 05:00:46 localhost ovn_controller[153664]: 2025-11-26T10:00:46Z|00201|binding|INFO|Releasing lport 3c3b338b-cdb6-4947-b1b2-a42d094ec69d from this chassis (sb_readonly=0) Nov 26 05:00:46 localhost nova_compute[281415]: 2025-11-26 10:00:46.994 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:46 localhost ovn_controller[153664]: 2025-11-26T10:00:46Z|00202|binding|INFO|Setting lport 3c3b338b-cdb6-4947-b1b2-a42d094ec69d down in Southbound Nov 26 05:00:46 localhost kernel: device tap3c3b338b-cd left promiscuous mode Nov 26 05:00:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:47.008 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 
'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3c3b338b-cdb6-4947-b1b2-a42d094ec69d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:47.010 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 3c3b338b-cdb6-4947-b1b2-a42d094ec69d in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:00:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:47.011 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:47.013 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[3899dfdd-e521-4ac2-9e4e-47e4ad9e0016]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:47 localhost nova_compute[281415]: 2025-11-26 10:00:47.020 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:47 
localhost nova_compute[281415]: 2025-11-26 10:00:47.106 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:47 localhost systemd[1]: var-lib-containers-storage-overlay-772f7a13db17459d130cb30e6659cd449d58d27e4ea80a3f236393d1bf746a5d-merged.mount: Deactivated successfully. Nov 26 05:00:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4208bf7e741801f2f636a86de14a54f0a453159be0574cfdf548a761bc2139c7-userdata-shm.mount: Deactivated successfully. Nov 26 05:00:47 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. Nov 26 05:00:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:47.237 262471 INFO neutron.agent.dhcp.agent [None req-6673fe81-4a50-465c-9ca3-3d01ecd86af5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:00:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:47.977 262471 INFO neutron.agent.linux.ip_lib [None req-1fe08345-1270-42be-a365-3d88f4d6049b - - - - - -] Device tap7627ead4-05 cannot be used as it has no MAC address#033[00m Nov 26 05:00:48 localhost nova_compute[281415]: 2025-11-26 10:00:48.000 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:48 localhost kernel: device tap7627ead4-05 entered promiscuous mode Nov 26 05:00:48 localhost ovn_controller[153664]: 2025-11-26T10:00:48Z|00203|binding|INFO|Claiming lport 7627ead4-050b-4457-90de-911c48975078 for this chassis. 
Nov 26 05:00:48 localhost ovn_controller[153664]: 2025-11-26T10:00:48Z|00204|binding|INFO|7627ead4-050b-4457-90de-911c48975078: Claiming unknown Nov 26 05:00:48 localhost NetworkManager[5970]: [1764151248.0092] manager: (tap7627ead4-05): new Generic device (/org/freedesktop/NetworkManager/Devices/35) Nov 26 05:00:48 localhost nova_compute[281415]: 2025-11-26 10:00:48.012 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:48 localhost systemd-udevd[313024]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:00:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:48.016 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7627ead4-050b-4457-90de-911c48975078) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:00:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:48.018 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 7627ead4-050b-4457-90de-911c48975078 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:00:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:48.020 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:00:48 localhost ovn_controller[153664]: 2025-11-26T10:00:48Z|00205|binding|INFO|Setting lport 7627ead4-050b-4457-90de-911c48975078 ovn-installed in OVS Nov 26 05:00:48 localhost ovn_controller[153664]: 2025-11-26T10:00:48Z|00206|binding|INFO|Setting lport 7627ead4-050b-4457-90de-911c48975078 up in Southbound Nov 26 05:00:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:48.021 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf0d271-4feb-4b21-b411-b3a8526b30b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:00:48 localhost nova_compute[281415]: 2025-11-26 10:00:48.046 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:48 localhost journal[229445]: ethtool ioctl error on tap7627ead4-05: No such device Nov 26 05:00:48 localhost nova_compute[281415]: 2025-11-26 10:00:48.066 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:48 localhost nova_compute[281415]: 2025-11-26 10:00:48.071 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:48 localhost journal[229445]: ethtool ioctl error on tap7627ead4-05: No such device Nov 26 05:00:48 localhost journal[229445]: ethtool ioctl error on tap7627ead4-05: No such device Nov 26 05:00:48 localhost journal[229445]: ethtool ioctl error on tap7627ead4-05: No such device Nov 26 05:00:48 localhost journal[229445]: ethtool ioctl error on tap7627ead4-05: No such device Nov 26 05:00:48 localhost journal[229445]: ethtool ioctl error on tap7627ead4-05: No such device Nov 26 05:00:48 localhost journal[229445]: ethtool ioctl error on tap7627ead4-05: No such device Nov 26 05:00:48 localhost journal[229445]: ethtool ioctl error on tap7627ead4-05: No such device Nov 26 05:00:48 localhost nova_compute[281415]: 2025-11-26 10:00:48.117 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:48 localhost sshd[313048]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:00:48 localhost nova_compute[281415]: 2025-11-26 10:00:48.150 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:49 localhost podman[313097]: Nov 26 05:00:49 localhost podman[313097]: 2025-11-26 10:00:49.087702961 +0000 UTC m=+0.097451775 container create 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:00:49 localhost systemd[1]: Started 
libpod-conmon-9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e.scope. Nov 26 05:00:49 localhost podman[313097]: 2025-11-26 10:00:49.04222589 +0000 UTC m=+0.051974754 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:00:49 localhost systemd[1]: Started libcrun container. Nov 26 05:00:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc85548c2d7ee8958b875e379c15db622eb4a14afe80c296fb65af075bac7bcc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:00:49 localhost podman[313097]: 2025-11-26 10:00:49.171319017 +0000 UTC m=+0.181067861 container init 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:00:49 localhost podman[313097]: 2025-11-26 10:00:49.181870118 +0000 UTC m=+0.191618962 container start 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 26 05:00:49 localhost dnsmasq[313115]: started, version 2.85 cachesize 150 Nov 26 05:00:49 localhost dnsmasq[313115]: DNS service limited to local subnets Nov 26 05:00:49 localhost dnsmasq[313115]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:00:49 localhost dnsmasq[313115]: warning: no upstream servers configured Nov 26 05:00:49 localhost dnsmasq-dhcp[313115]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:00:49 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:49 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:49 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:49.317 262471 INFO neutron.agent.dhcp.agent [None req-24702c4a-5a31-4a1b-a0b7-c741037acb9a - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:49 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:49 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:49 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:49 localhost podman[313133]: 2025-11-26 10:00:49.565662187 +0000 UTC m=+0.066993926 container kill 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 26 05:00:49 localhost neutron_sriov_agent[255515]: 2025-11-26 10:00:49.885 2 INFO neutron.agent.securitygroups_rpc [None req-a7103a11-51ce-4b09-bb68-e73bcd8973ee d4acbeebb8b34223b4a73b397ac29666 05d917b602bb4665ad2ef0dceefd4842 - - default default] Security group member updated ['90f5e819-a82e-4098-b570-22d3f7905e1f']#033[00m Nov 26 05:00:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:49.951 262471 INFO neutron.agent.dhcp.agent [None req-b4f3e17d-623b-4ff1-892a-ad9706187ddc - - - - - -] DHCP configuration for ports {'7627ead4-050b-4457-90de-911c48975078', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:00:50 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:50.108 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:49Z, description=, device_id=d31c288d-6ed2-4189-99fb-f941509a5b11, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=03105257-80e9-44e8-ad68-e3729bbf858b, ip_allocation=immediate, mac_address=fa:16:3e:23:f0:6f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, 
provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['4e93cabe-c0aa-42f9-a1b0-fe82016bce53'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:48Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=False, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1221, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:49Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:00:50 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:50 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:50 localhost podman[313171]: 2025-11-26 10:00:50.310110872 +0000 UTC m=+0.061333699 container kill 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:00:50 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:50 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:50.608 262471 INFO neutron.agent.dhcp.agent [None req-54a73558-1c56-4d3a-b495-2eeb1a2425cb - - - - - -] DHCP configuration 
for ports {'03105257-80e9-44e8-ad68-e3729bbf858b'} is completed#033[00m Nov 26 05:00:50 localhost nova_compute[281415]: 2025-11-26 10:00:50.850 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:51.163 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:49Z, description=, device_id=d31c288d-6ed2-4189-99fb-f941509a5b11, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=03105257-80e9-44e8-ad68-e3729bbf858b, ip_allocation=immediate, mac_address=fa:16:3e:23:f0:6f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['4e93cabe-c0aa-42f9-a1b0-fe82016bce53'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:48Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=False, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1221, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, 
updated_at=2025-11-26T10:00:49Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:00:51 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:00:51 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:51 localhost podman[313209]: 2025-11-26 10:00:51.380399286 +0000 UTC m=+0.068737087 container kill 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:00:51 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:51 localhost nova_compute[281415]: 2025-11-26 10:00:51.816 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:51.830 262471 INFO neutron.agent.dhcp.agent [None req-65024779-51d6-4758-80dd-25f77b60caa5 - - - - - -] DHCP configuration for ports {'03105257-80e9-44e8-ad68-e3729bbf858b'} is completed#033[00m Nov 26 05:00:52 localhost nova_compute[281415]: 2025-11-26 10:00:52.144 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:53 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:00:53 localhost podman[313247]: 2025-11-26 
10:00:53.761757496 +0000 UTC m=+0.066278386 container kill 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:00:53 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:00:53 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:00:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:00:55 localhost nova_compute[281415]: 2025-11-26 10:00:55.885 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:56 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:00:56 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:00:56 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:00:56 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:00:56 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:00:56 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:00:56 localhost dnsmasq[313115]: exiting on receipt of SIGTERM Nov 26 05:00:56 localhost systemd[1]: tmp-crun.YUI6eb.mount: Deactivated successfully. 
Nov 26 05:00:56 localhost systemd[1]: libpod-9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e.scope: Deactivated successfully. Nov 26 05:00:56 localhost podman[313374]: 2025-11-26 10:00:56.642615297 +0000 UTC m=+0.074336914 container kill 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 26 05:00:56 localhost podman[313390]: 2025-11-26 10:00:56.715594579 +0000 UTC m=+0.058018832 container died 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:00:56 localhost ovn_controller[153664]: 2025-11-26T10:00:56Z|00207|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:00:56 localhost systemd[1]: tmp-crun.uPgZ7w.mount: Deactivated successfully. 
Nov 26 05:00:56 localhost nova_compute[281415]: 2025-11-26 10:00:56.749 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:00:56 localhost podman[313390]: 2025-11-26 10:00:56.830141718 +0000 UTC m=+0.172565921 container cleanup 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:00:56 localhost systemd[1]: libpod-conmon-9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e.scope: Deactivated successfully. 
Nov 26 05:00:56 localhost podman[313393]: 2025-11-26 10:00:56.858472523 +0000 UTC m=+0.190121368 container remove 9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 26 05:00:56 localhost nova_compute[281415]: 2025-11-26 10:00:56.874 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:56 localhost kernel: device tap7627ead4-05 left promiscuous mode
Nov 26 05:00:56 localhost ovn_controller[153664]: 2025-11-26T10:00:56Z|00208|binding|INFO|Releasing lport 7627ead4-050b-4457-90de-911c48975078 from this chassis (sb_readonly=0)
Nov 26 05:00:56 localhost ovn_controller[153664]: 2025-11-26T10:00:56Z|00209|binding|INFO|Setting lport 7627ead4-050b-4457-90de-911c48975078 down in Southbound
Nov 26 05:00:56 localhost nova_compute[281415]: 2025-11-26 10:00:56.902 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:56.907 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7627ead4-050b-4457-90de-911c48975078) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:00:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:56.910 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 7627ead4-050b-4457-90de-911c48975078 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m
Nov 26 05:00:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:56.912 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 26 05:00:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:56.913 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[b959ae3b-d890-4db7-b4de-426531ff2df6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:00:57 localhost nova_compute[281415]: 2025-11-26 10:00:57.186 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:57 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:57.252 262471 INFO neutron.agent.dhcp.agent [None req-dc584d5a-347f-4563-8acf-0bdeeeb327cf - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:00:57 localhost podman[240049]: time="2025-11-26T10:00:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 26 05:00:57 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 05:00:57 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg'
Nov 26 05:00:57 localhost podman[240049]: @ - - [26/Nov/2025:10:00:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1"
Nov 26 05:00:57 localhost podman[240049]: @ - - [26/Nov/2025:10:00:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18775 "" "Go-http-client/1.1"
Nov 26 05:00:57 localhost systemd[1]: var-lib-containers-storage-overlay-dc85548c2d7ee8958b875e379c15db622eb4a14afe80c296fb65af075bac7bcc-merged.mount: Deactivated successfully.
Nov 26 05:00:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a8431948e8b546f2fc6459240e8119b42236f9908a4b429297186809fc5955e-userdata-shm.mount: Deactivated successfully.
Nov 26 05:00:57 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully.
Nov 26 05:00:58 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:58.151 262471 INFO neutron.agent.linux.ip_lib [None req-8081e60d-c0e5-499e-835b-72e61ed02939 - - - - - -] Device tape47fb967-d0 cannot be used as it has no MAC address#033[00m
Nov 26 05:00:58 localhost nova_compute[281415]: 2025-11-26 10:00:58.180 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:58 localhost kernel: device tape47fb967-d0 entered promiscuous mode
Nov 26 05:00:58 localhost ovn_controller[153664]: 2025-11-26T10:00:58Z|00210|binding|INFO|Claiming lport e47fb967-d06b-445a-9414-426bd237eaa7 for this chassis.
Nov 26 05:00:58 localhost ovn_controller[153664]: 2025-11-26T10:00:58Z|00211|binding|INFO|e47fb967-d06b-445a-9414-426bd237eaa7: Claiming unknown
Nov 26 05:00:58 localhost NetworkManager[5970]: [1764151258.1914] manager: (tape47fb967-d0): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Nov 26 05:00:58 localhost nova_compute[281415]: 2025-11-26 10:00:58.194 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:58 localhost systemd-udevd[313479]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 05:00:58 localhost ovn_controller[153664]: 2025-11-26T10:00:58Z|00212|binding|INFO|Setting lport e47fb967-d06b-445a-9414-426bd237eaa7 ovn-installed in OVS
Nov 26 05:00:58 localhost ovn_controller[153664]: 2025-11-26T10:00:58Z|00213|binding|INFO|Setting lport e47fb967-d06b-445a-9414-426bd237eaa7 up in Southbound
Nov 26 05:00:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:58.203 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e47fb967-d06b-445a-9414-426bd237eaa7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:00:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:58.205 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e47fb967-d06b-445a-9414-426bd237eaa7 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m
Nov 26 05:00:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:58.206 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 26 05:00:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:00:58.207 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[247b59fa-da35-4f3b-8261-a884b8bb0abe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:00:58 localhost nova_compute[281415]: 2025-11-26 10:00:58.241 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:58 localhost nova_compute[281415]: 2025-11-26 10:00:58.264 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:58 localhost nova_compute[281415]: 2025-11-26 10:00:58.316 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:58 localhost nova_compute[281415]: 2025-11-26 10:00:58.354 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:00:58 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg'
Nov 26 05:00:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 05:00:59 localhost podman[313534]:
Nov 26 05:00:59 localhost podman[313534]: 2025-11-26 10:00:59.292926069 +0000 UTC m=+0.097259849 container create c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:00:59 localhost systemd[1]: Started libpod-conmon-c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047.scope.
Nov 26 05:00:59 localhost podman[313534]: 2025-11-26 10:00:59.243357137 +0000 UTC m=+0.047690957 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:00:59 localhost systemd[1]: Started libcrun container.
Nov 26 05:00:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9be1c678b86218545f264a807c97d07b61b70f19ad9c439d1d59af3eb626a2c7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:00:59 localhost podman[313534]: 2025-11-26 10:00:59.379469202 +0000 UTC m=+0.183802992 container init c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 26 05:00:59 localhost podman[313534]: 2025-11-26 10:00:59.393497065 +0000 UTC m=+0.197830845 container start c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 26 05:00:59 localhost dnsmasq[313553]: started, version 2.85 cachesize 150
Nov 26 05:00:59 localhost dnsmasq[313553]: DNS service limited to local subnets
Nov 26 05:00:59 localhost dnsmasq[313553]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:00:59 localhost dnsmasq[313553]: warning: no upstream servers configured
Nov 26 05:00:59 localhost dnsmasq[313553]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:00:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:59.469 262471 INFO neutron.agent.dhcp.agent [None req-8081e60d-c0e5-499e-835b-72e61ed02939 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:57Z, description=, device_id=c96897b9-31c3-462b-b4f5-098c4aa03e09, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=546b8f53-bcf4-4c47-9d3d-5d868c56c4b4, ip_allocation=immediate, mac_address=fa:16:3e:6e:57:51, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['18a7decd-c4b9-4ebf-bb02-2420d0b0469c'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:56Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=False, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1245, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:58Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m
Nov 26 05:00:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:59.556 262471 INFO neutron.agent.dhcp.agent [None req-ed9d5e69-2f76-4cc5-a7df-41de9448e69b - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m
Nov 26 05:00:59 localhost dnsmasq[313553]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses
Nov 26 05:00:59 localhost podman[313569]: 2025-11-26 10:00:59.68900601 +0000 UTC m=+0.064230005 container kill c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 26 05:00:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:00:59.884 262471 INFO neutron.agent.dhcp.agent [None req-8081e60d-c0e5-499e-835b-72e61ed02939 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:00:57Z, description=, device_id=c96897b9-31c3-462b-b4f5-098c4aa03e09, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=546b8f53-bcf4-4c47-9d3d-5d868c56c4b4, ip_allocation=immediate, mac_address=fa:16:3e:6e:57:51, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['18a7decd-c4b9-4ebf-bb02-2420d0b0469c'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:56Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=False, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1245, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:00:58Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m
Nov 26 05:00:59 localhost nova_compute[281415]: 2025-11-26 10:00:59.955 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:00 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:00.025 262471 INFO neutron.agent.dhcp.agent [None req-0b25529d-4b59-49ec-aee3-b2d2503bdf26 - - - - - -] DHCP configuration for ports {'546b8f53-bcf4-4c47-9d3d-5d868c56c4b4'} is completed#033[00m
Nov 26 05:01:00 localhost dnsmasq[313553]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses
Nov 26 05:01:00 localhost podman[313607]: 2025-11-26 10:01:00.099494806 +0000 UTC m=+0.076430054 container kill c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 26 05:01:00 localhost nova_compute[281415]: 2025-11-26 10:01:00.130 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 05:01:00 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:00.464 262471 INFO neutron.agent.dhcp.agent [None req-3d0f8831-d343-4b0d-bf92-9623931a3963 - - - - - -] DHCP configuration for ports {'546b8f53-bcf4-4c47-9d3d-5d868c56c4b4'} is completed#033[00m
Nov 26 05:01:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 05:01:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.
Nov 26 05:01:00 localhost systemd[1]: tmp-crun.5ZxKkQ.mount: Deactivated successfully.
Nov 26 05:01:00 localhost podman[313629]: 2025-11-26 10:01:00.844600501 +0000 UTC m=+0.099743713 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 26 05:01:00 localhost nova_compute[281415]: 2025-11-26 10:01:00.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 05:01:00 localhost podman[313629]: 2025-11-26 10:01:00.882641232 +0000 UTC m=+0.137784444 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 26 05:01:00 localhost nova_compute[281415]: 2025-11-26 10:01:00.888 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:00 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully.
Nov 26 05:01:00 localhost podman[313630]: 2025-11-26 10:01:00.89952697 +0000 UTC m=+0.150535409 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 26 05:01:00 localhost podman[313630]: 2025-11-26 10:01:00.941407696 +0000 UTC m=+0.192416175 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 26 05:01:00 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully.
Nov 26 05:01:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:01.209 262471 INFO neutron.agent.linux.ip_lib [None req-e6ed6e98-7c91-448f-ad4c-0d55011f1401 - - - - - -] Device tapd7937d5d-fd cannot be used as it has no MAC address#033[00m
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.284 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:01 localhost kernel: device tapd7937d5d-fd entered promiscuous mode
Nov 26 05:01:01 localhost ovn_controller[153664]: 2025-11-26T10:01:01Z|00214|binding|INFO|Claiming lport d7937d5d-fd9a-4156-9554-2b8e5c61a861 for this chassis.
Nov 26 05:01:01 localhost ovn_controller[153664]: 2025-11-26T10:01:01Z|00215|binding|INFO|d7937d5d-fd9a-4156-9554-2b8e5c61a861: Claiming unknown
Nov 26 05:01:01 localhost NetworkManager[5970]: [1764151261.2961] manager: (tapd7937d5d-fd): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.294 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:01 localhost systemd[1]: tmp-crun.uF4fnO.mount: Deactivated successfully.
Nov 26 05:01:01 localhost systemd-udevd[313714]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 05:01:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:01.307 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-a67680f7-b6ab-4378-9ba9-788d286c7bb6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a67680f7-b6ab-4378-9ba9-788d286c7bb6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f6fde3de4aa44469b622b6f00e3baee9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f4e28f9-bd6b-4809-ac7f-8831f217257b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d7937d5d-fd9a-4156-9554-2b8e5c61a861) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:01:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:01.309 159486 INFO neutron.agent.ovn.metadata.agent [-] Port d7937d5d-fd9a-4156-9554-2b8e5c61a861 in datapath a67680f7-b6ab-4378-9ba9-788d286c7bb6 bound to our chassis#033[00m
Nov 26 05:01:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:01.310 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a67680f7-b6ab-4378-9ba9-788d286c7bb6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 26 05:01:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:01.311 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[9c915db2-1dfd-43ad-8d34-9e39603981d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:01:01 localhost ovn_controller[153664]: 2025-11-26T10:01:01Z|00216|binding|INFO|Setting lport d7937d5d-fd9a-4156-9554-2b8e5c61a861 ovn-installed in OVS
Nov 26 05:01:01 localhost ovn_controller[153664]: 2025-11-26T10:01:01Z|00217|binding|INFO|Setting lport d7937d5d-fd9a-4156-9554-2b8e5c61a861 up in Southbound
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.336 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:01 localhost dnsmasq[313553]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:01:01 localhost podman[313706]: 2025-11-26 10:01:01.360845976 +0000 UTC m=+0.084027429 container kill c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.382 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.421 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.607 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:01 localhost kernel: device tape47fb967-d0 left promiscuous mode
Nov 26 05:01:01 localhost ovn_controller[153664]: 2025-11-26T10:01:01Z|00218|binding|INFO|Releasing lport e47fb967-d06b-445a-9414-426bd237eaa7 from this chassis (sb_readonly=0)
Nov 26 05:01:01 localhost ovn_controller[153664]: 2025-11-26T10:01:01Z|00219|binding|INFO|Setting lport e47fb967-d06b-445a-9414-426bd237eaa7 down in Southbound
Nov 26 05:01:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:01.622 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e47fb967-d06b-445a-9414-426bd237eaa7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:01:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:01.624 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e47fb967-d06b-445a-9414-426bd237eaa7 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m
Nov 26 05:01:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:01.625 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc3dc995-51cd-4d70-be2c-11c47524552d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 26 05:01:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:01.626 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[bde3aafe-94a0-4d84-95ba-4377d4c926db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.632 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.633 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.848 281419 DEBUG oslo_service.periodic_task [None
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:01:01 localhost nova_compute[281415]: 2025-11-26 10:01:01.850 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:01:01 localhost dnsmasq[313553]: exiting on receipt of SIGTERM Nov 26 05:01:01 localhost podman[313775]: 2025-11-26 10:01:01.976344908 +0000 UTC m=+0.068122610 container kill c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:01:01 localhost systemd[1]: libpod-c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047.scope: Deactivated successfully. Nov 26 05:01:02 localhost ovn_controller[153664]: 2025-11-26T10:01:02Z|00220|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:02 localhost nova_compute[281415]: 2025-11-26 10:01:02.057 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:02 localhost podman[313792]: 2025-11-26 10:01:02.066001552 +0000 UTC m=+0.077273630 container died c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:02 localhost podman[313792]: 2025-11-26 10:01:02.099688546 +0000 UTC m=+0.110960574 container cleanup c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:01:02 localhost systemd[1]: libpod-conmon-c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047.scope: Deactivated successfully. Nov 26 05:01:02 localhost podman[313799]: 2025-11-26 10:01:02.141900071 +0000 UTC m=+0.139065743 container remove c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:02 localhost nova_compute[281415]: 2025-11-26 10:01:02.188 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:02 localhost systemd[1]: var-lib-containers-storage-overlay-9be1c678b86218545f264a807c97d07b61b70f19ad9c439d1d59af3eb626a2c7-merged.mount: Deactivated successfully. 
Nov 26 05:01:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c811fc0489f18f8d875e7029378f10a9a64839958e089c24e270381a48104047-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:02 localhost podman[313841]: Nov 26 05:01:02 localhost podman[313841]: 2025-11-26 10:01:02.385923697 +0000 UTC m=+0.094488028 container create 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:02 localhost systemd[1]: Started libpod-conmon-9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4.scope. Nov 26 05:01:02 localhost podman[313841]: 2025-11-26 10:01:02.343533367 +0000 UTC m=+0.052097748 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:02 localhost systemd[1]: tmp-crun.2xCBbT.mount: Deactivated successfully. Nov 26 05:01:02 localhost systemd[1]: Started libcrun container. 
Nov 26 05:01:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f2887044c3a71aa35078f40444aa57ec739fb227f3d27ed3109d60112fdf9b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:02 localhost podman[313841]: 2025-11-26 10:01:02.474810939 +0000 UTC m=+0.183375290 container init 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:01:02 localhost podman[313841]: 2025-11-26 10:01:02.484204525 +0000 UTC m=+0.192768876 container start 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 05:01:02 localhost dnsmasq[313860]: started, version 2.85 cachesize 150 Nov 26 05:01:02 localhost dnsmasq[313860]: DNS service limited to local subnets Nov 26 05:01:02 localhost dnsmasq[313860]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:02 localhost dnsmasq[313860]: warning: no upstream servers 
configured Nov 26 05:01:02 localhost dnsmasq-dhcp[313860]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:01:02 localhost dnsmasq[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/addn_hosts - 0 addresses Nov 26 05:01:02 localhost dnsmasq-dhcp[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/host Nov 26 05:01:02 localhost dnsmasq-dhcp[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/opts Nov 26 05:01:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:02.664 262471 INFO neutron.agent.dhcp.agent [None req-2cbcfa58-da2e-4cfb-badd-afc0a2abb361 - - - - - -] DHCP configuration for ports {'d7791386-37ef-4eb2-8fe4-394b9a9c067f'} is completed#033[00m Nov 26 05:01:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:02.671 262471 INFO neutron.agent.dhcp.agent [None req-2e866f97-6186-4313-93e2-74cd50153e07 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:02 localhost nova_compute[281415]: 2025-11-26 10:01:02.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:01:03 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. 
Nov 26 05:01:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:03.667 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:01:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:03.667 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:01:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:03.668 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:01:03 localhost nova_compute[281415]: 2025-11-26 10:01:03.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:01:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:05 localhost nova_compute[281415]: 2025-11-26 10:01:05.921 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.360 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.361 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.362 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.362 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.363 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:01:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:06.641 262471 INFO neutron.agent.linux.ip_lib [None req-5ae1d170-c7f3-48fd-bdd4-3acb686c22ac - - - - - -] Device tapf2bee9b2-20 cannot be used as it has no MAC address#033[00m Nov 26 05:01:06 localhost podman[313884]: 2025-11-26 10:01:06.655020809 +0000 UTC m=+0.108240544 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, release=1755695350) Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.671 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:06 localhost podman[313884]: 2025-11-26 10:01:06.677265465 +0000 UTC m=+0.130485150 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': 
'/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64) Nov 26 05:01:06 localhost kernel: device tapf2bee9b2-20 entered promiscuous mode Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.680 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:06 localhost NetworkManager[5970]: [1764151266.6812] manager: (tapf2bee9b2-20): new Generic device (/org/freedesktop/NetworkManager/Devices/38) Nov 26 05:01:06 localhost ovn_controller[153664]: 2025-11-26T10:01:06Z|00221|binding|INFO|Claiming lport f2bee9b2-20ce-4529-8532-a72dc6f86276 for this chassis. Nov 26 05:01:06 localhost ovn_controller[153664]: 2025-11-26T10:01:06Z|00222|binding|INFO|f2bee9b2-20ce-4529-8532-a72dc6f86276: Claiming unknown Nov 26 05:01:06 localhost systemd-udevd[313920]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:01:06 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:01:06 localhost ovn_controller[153664]: 2025-11-26T10:01:06Z|00223|binding|INFO|Setting lport f2bee9b2-20ce-4529-8532-a72dc6f86276 ovn-installed in OVS Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.712 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:06 localhost ovn_controller[153664]: 2025-11-26T10:01:06Z|00224|binding|INFO|Setting lport f2bee9b2-20ce-4529-8532-a72dc6f86276 up in Southbound Nov 26 05:01:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:06.721 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-eb17c5e9-479e-4734-b91b-09d4f037002c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb17c5e9-479e-4734-b91b-09d4f037002c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f6fde3de4aa44469b622b6f00e3baee9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9502b79a-5d51-45fb-bffa-0cd60cd8583e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f2bee9b2-20ce-4529-8532-a72dc6f86276) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:06 localhost 
ovn_metadata_agent[159481]: 2025-11-26 10:01:06.723 159486 INFO neutron.agent.ovn.metadata.agent [-] Port f2bee9b2-20ce-4529-8532-a72dc6f86276 in datapath eb17c5e9-479e-4734-b91b-09d4f037002c bound to our chassis#033[00m Nov 26 05:01:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:06.724 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eb17c5e9-479e-4734-b91b-09d4f037002c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:01:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:06.725 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[46bdaebc-c8a4-4db8-af03-c19db08d593c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:06 localhost podman[313883]: 2025-11-26 10:01:06.764175888 +0000 UTC m=+0.217406223 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.774 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.805 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:01:06 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2491075087' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:01:06 localhost podman[313883]: 2025-11-26 10:01:06.838531411 +0000 UTC m=+0.291761746 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.845 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:01:06 localhost systemd[1]: 
123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.930 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:01:06 localhost nova_compute[281415]: 2025-11-26 10:01:06.931 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:01:07 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:07.132 262471 INFO neutron.agent.linux.ip_lib [None req-368dcbde-84d0-4486-b1c5-f5d13a4a0694 - - - - - -] Device tapc61a163e-58 cannot be used as it has no MAC address#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.204 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.211 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:01:07 localhost kernel: device tapc61a163e-58 entered promiscuous mode Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.213 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11272MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.213 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.214 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:01:07 localhost NetworkManager[5970]: [1764151267.2149] manager: (tapc61a163e-58): new Generic device (/org/freedesktop/NetworkManager/Devices/39) Nov 26 05:01:07 localhost ovn_controller[153664]: 2025-11-26T10:01:07Z|00225|binding|INFO|Claiming lport c61a163e-5877-444d-94d4-34ab662947de for this chassis. 
Nov 26 05:01:07 localhost ovn_controller[153664]: 2025-11-26T10:01:07Z|00226|binding|INFO|c61a163e-5877-444d-94d4-34ab662947de: Claiming unknown Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.222 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.231 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec6:1ea3/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c61a163e-5877-444d-94d4-34ab662947de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.233 159486 INFO neutron.agent.ovn.metadata.agent [-] Port c61a163e-5877-444d-94d4-34ab662947de in datapath 
cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.235 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port f5f5fa7e-4c75-4f21-8f3c-e8733aee4d12 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.236 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.237 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4fde9163-1be2-4cd2-aa86-71c6ba345e52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:07 localhost ovn_controller[153664]: 2025-11-26T10:01:07Z|00227|binding|INFO|Setting lport c61a163e-5877-444d-94d4-34ab662947de ovn-installed in OVS Nov 26 05:01:07 localhost ovn_controller[153664]: 2025-11-26T10:01:07Z|00228|binding|INFO|Setting lport c61a163e-5877-444d-94d4-34ab662947de up in Southbound Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.251 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.268 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 
'9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.268 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.270 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.308 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.309 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.309 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.318 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.351 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.373 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.752 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2 
2001:db8::f816:3eff:feaf:7403'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 2001:db8::f816:3eff:feaf:7403'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.755 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d 
updated#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.758 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port f5f5fa7e-4c75-4f21-8f3c-e8733aee4d12 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.758 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:07 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:07.759 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[9557edb3-d3e3-4107-942c-485741091562]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:01:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/684212839' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.846 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.852 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:01:07 localhost podman[314042]: Nov 26 05:01:07 localhost podman[314042]: 2025-11-26 10:01:07.870496035 +0000 UTC m=+0.085836733 container create 44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eb17c5e9-479e-4734-b91b-09d4f037002c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.872 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.873 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:01:07 localhost nova_compute[281415]: 2025-11-26 10:01:07.874 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:01:07 localhost systemd[1]: Started libpod-conmon-44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f.scope. Nov 26 05:01:07 localhost systemd[1]: Started libcrun container. 
Nov 26 05:01:07 localhost podman[314042]: 2025-11-26 10:01:07.828381943 +0000 UTC m=+0.043722711 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e91ae83446e8fe46e6270c82bffeef94dd3cc465ad2416e8263bf393c090f8cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:07 localhost podman[314042]: 2025-11-26 10:01:07.944362983 +0000 UTC m=+0.159703701 container init 44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eb17c5e9-479e-4734-b91b-09d4f037002c, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:01:07 localhost podman[314042]: 2025-11-26 10:01:07.954014528 +0000 UTC m=+0.169355246 container start 44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eb17c5e9-479e-4734-b91b-09d4f037002c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:01:07 localhost dnsmasq[314067]: started, version 2.85 cachesize 150 Nov 26 05:01:07 localhost dnsmasq[314067]: DNS service limited to local subnets Nov 26 05:01:07 localhost dnsmasq[314067]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:07 localhost dnsmasq[314067]: warning: no upstream servers configured Nov 26 05:01:07 localhost dnsmasq-dhcp[314067]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:01:07 localhost dnsmasq[314067]: read /var/lib/neutron/dhcp/eb17c5e9-479e-4734-b91b-09d4f037002c/addn_hosts - 0 addresses Nov 26 05:01:07 localhost dnsmasq-dhcp[314067]: read /var/lib/neutron/dhcp/eb17c5e9-479e-4734-b91b-09d4f037002c/host Nov 26 05:01:07 localhost dnsmasq-dhcp[314067]: read /var/lib/neutron/dhcp/eb17c5e9-479e-4734-b91b-09d4f037002c/opts Nov 26 05:01:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:08.238 262471 INFO neutron.agent.dhcp.agent [None req-073859e0-9dd4-4d53-950f-1f114f396340 - - - - - -] DHCP configuration for ports {'dd1f05b0-1b5a-49b9-8676-68ecc4a05ca7'} is completed#033[00m Nov 26 05:01:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:08.272 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:01:08 localhost podman[314090]: Nov 26 05:01:08 localhost podman[314090]: 2025-11-26 10:01:08.315357305 +0000 UTC m=+0.105433581 container create 5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:08 localhost systemd[1]: Started libpod-conmon-5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d.scope. Nov 26 05:01:08 localhost podman[314090]: 2025-11-26 10:01:08.26804871 +0000 UTC m=+0.058125036 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:08 localhost systemd[1]: Started libcrun container. Nov 26 05:01:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68532f5748e15f359f37d2a8448a83f69ecb58209cd1ec49af6b085aeae7e3ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:08 localhost podman[314090]: 2025-11-26 10:01:08.386360039 +0000 UTC m=+0.176436315 container init 5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:08 localhost podman[314090]: 2025-11-26 10:01:08.394685684 +0000 UTC m=+0.184761990 container start 5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:08 localhost dnsmasq[314109]: started, version 2.85 cachesize 150 Nov 26 05:01:08 localhost dnsmasq[314109]: DNS service limited to local subnets Nov 26 05:01:08 localhost dnsmasq[314109]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:08 localhost dnsmasq[314109]: warning: no upstream servers configured Nov 26 05:01:08 localhost dnsmasq[314109]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:08 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:08.411 2 INFO neutron.agent.securitygroups_rpc [None req-e94d689b-9cb9-43a7-bb30-0b4d00f6bd14 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:08.602 262471 INFO neutron.agent.dhcp.agent [None req-6c2a6586-04d0-44f6-9188-c3b1fceff720 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:01:08 localhost dnsmasq[314109]: exiting on receipt of SIGTERM Nov 26 05:01:08 localhost podman[314128]: 2025-11-26 10:01:08.788090696 +0000 UTC m=+0.068739428 container kill 5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:08 localhost systemd[1]: 
libpod-5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d.scope: Deactivated successfully. Nov 26 05:01:08 localhost podman[314141]: 2025-11-26 10:01:08.871171907 +0000 UTC m=+0.067351698 container died 5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:08 localhost systemd[1]: tmp-crun.B3YvvI.mount: Deactivated successfully. Nov 26 05:01:08 localhost nova_compute[281415]: 2025-11-26 10:01:08.874 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:01:08 localhost nova_compute[281415]: 2025-11-26 10:01:08.875 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:01:08 localhost nova_compute[281415]: 2025-11-26 10:01:08.875 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:01:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d-userdata-shm.mount: Deactivated successfully. 
Nov 26 05:01:08 localhost podman[314141]: 2025-11-26 10:01:08.913622098 +0000 UTC m=+0.109801839 container cleanup 5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:08 localhost systemd[1]: libpod-conmon-5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d.scope: Deactivated successfully. Nov 26 05:01:08 localhost podman[314143]: 2025-11-26 10:01:08.950856367 +0000 UTC m=+0.138057793 container remove 5f8d61c134f5bb8c9a55760e91aaf1ed407b5e9dfa80f18c95a9c4666ad8944d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:01:08 localhost nova_compute[281415]: 2025-11-26 10:01:08.983 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:01:08 localhost nova_compute[281415]: 2025-11-26 10:01:08.983 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock 
"refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:01:08 localhost nova_compute[281415]: 2025-11-26 10:01:08.984 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:01:08 localhost nova_compute[281415]: 2025-11-26 10:01:08.984 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:01:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:09 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:09.407 2 INFO neutron.agent.securitygroups_rpc [None req-f32ff883-8128-4f91-b20c-5654b7184b69 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:09 localhost nova_compute[281415]: 2025-11-26 10:01:09.452 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:01:09 localhost nova_compute[281415]: 2025-11-26 10:01:09.470 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:01:09 localhost nova_compute[281415]: 2025-11-26 10:01:09.471 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:01:09 localhost systemd[1]: var-lib-containers-storage-overlay-68532f5748e15f359f37d2a8448a83f69ecb58209cd1ec49af6b085aeae7e3ec-merged.mount: Deactivated successfully. 
Nov 26 05:01:10 localhost podman[314223]: Nov 26 05:01:10 localhost podman[314223]: 2025-11-26 10:01:10.55174692 +0000 UTC m=+0.092328614 container create 10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:01:10 localhost systemd[1]: Started libpod-conmon-10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213.scope. Nov 26 05:01:10 localhost podman[314223]: 2025-11-26 10:01:10.504796705 +0000 UTC m=+0.045378389 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:10 localhost systemd[1]: Started libcrun container. 
Nov 26 05:01:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f14b0ab69be9a8732b0cbbb12449a783a9e586a7785df9f5c53279971006ad34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:10 localhost podman[314223]: 2025-11-26 10:01:10.631643326 +0000 UTC m=+0.172225010 container init 10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:01:10 localhost podman[314223]: 2025-11-26 10:01:10.641636981 +0000 UTC m=+0.182218665 container start 10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:10 localhost dnsmasq[314253]: started, version 2.85 cachesize 150 Nov 26 05:01:10 localhost dnsmasq[314253]: DNS service limited to local subnets Nov 26 05:01:10 localhost dnsmasq[314253]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:10 localhost dnsmasq[314253]: warning: no upstream servers 
configured Nov 26 05:01:10 localhost dnsmasq-dhcp[314253]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:01:10 localhost dnsmasq[314253]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 2 addresses Nov 26 05:01:10 localhost dnsmasq-dhcp[314253]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:10 localhost dnsmasq-dhcp[314253]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:10 localhost podman[314237]: 2025-11-26 10:01:10.685285428 +0000 UTC m=+0.087212223 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm) Nov 26 05:01:10 localhost podman[314237]: 2025-11-26 10:01:10.699317072 +0000 UTC m=+0.101243877 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:01:10 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:01:10 localhost nova_compute[281415]: 2025-11-26 10:01:10.961 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0. Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.089817) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37 Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151271089872, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1698, "num_deletes": 258, "total_data_size": 2260784, "memory_usage": 2293760, "flush_reason": "Manual Compaction"} Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151271104072, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1465480, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21236, "largest_seqno": 22929, "table_properties": {"data_size": 1458789, "index_size": 3841, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15009, "raw_average_key_size": 21, "raw_value_size": 1445112, "raw_average_value_size": 2052, "num_data_blocks": 163, "num_entries": 704, "num_filter_entries": 704, "num_deletions": 258, "num_merge_operands": 0, 
"num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151168, "oldest_key_time": 1764151168, "file_creation_time": 1764151271, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}} Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 14334 microseconds, and 8748 cpu microseconds. Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.104145) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1465480 bytes OK Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.104178) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.105889) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.105912) EVENT_LOG_v1 {"time_micros": 1764151271105905, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.105963) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 2252924, prev total WAL file size 2252924, number of live WAL files 2. Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.106794) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. 
'7061786F73003131373938' seq:0, type:0; will stop at (end) Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1431KB)], [36(15MB)] Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151271106856, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 17451055, "oldest_snapshot_seqno": -1} Nov 26 05:01:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:11.108 262471 INFO neutron.agent.dhcp.agent [None req-3a63b765-e5b1-40cf-9bd1-72d4d4bf6d2b - - - - - -] DHCP configuration for ports {'48abd84e-ed76-4b5b-a680-be4718d615b3', 'ba010266-c829-4775-9f81-9e5e8ac0a898', 'c61a163e-5877-444d-94d4-34ab662947de'} is completed#033[00m Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 11933 keys, 14682387 bytes, temperature: kUnknown Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151271188265, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 14682387, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14615511, "index_size": 35927, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29893, "raw_key_size": 320744, "raw_average_key_size": 26, "raw_value_size": 14413424, "raw_average_value_size": 1207, "num_data_blocks": 1358, "num_entries": 11933, "num_filter_entries": 11933, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, 
"format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151271, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}} Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.188613) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 14682387 bytes Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.192034) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.1 rd, 180.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 15.2 +0.0 blob) out(14.0 +0.0 blob), read-write-amplify(21.9) write-amplify(10.0) OK, records in: 12471, records dropped: 538 output_compression: NoCompression Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.192072) EVENT_LOG_v1 {"time_micros": 1764151271192052, "job": 20, "event": "compaction_finished", "compaction_time_micros": 81510, "compaction_time_cpu_micros": 46115, "output_level": 6, "num_output_files": 1, "total_output_size": 14682387, 
"num_input_records": 12471, "num_output_records": 11933, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151271192427, "job": 20, "event": "table_file_deletion", "file_number": 38} Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151271194869, "job": 20, "event": "table_file_deletion", "file_number": 36} Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.106654) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.195011) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.195020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.195023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:01:11.195026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:01:11 localhost ceph-mon[297296]: rocksdb: (Original 
Log Time 2025/11/26-10:01:11.195029) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:01:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:11.261 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:10Z, description=, device_id=2f9b4a1c-ee75-4d4a-be09-d59d248cfe0e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=523294ae-c3ef-4054-bb82-e941cbadf0ad, ip_allocation=immediate, mac_address=fa:16:3e:0a:71:29, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:58Z, description=, dns_domain=, id=a67680f7-b6ab-4378-9ba9-788d286c7bb6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-328279983, port_security_enabled=True, project_id=f6fde3de4aa44469b622b6f00e3baee9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42475, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1257, status=ACTIVE, subnets=['859066e2-4e5b-4ddd-9b43-bc90661a631f'], tags=[], tenant_id=f6fde3de4aa44469b622b6f00e3baee9, updated_at=2025-11-26T10:01:00Z, vlan_transparent=None, network_id=a67680f7-b6ab-4378-9ba9-788d286c7bb6, port_security_enabled=False, project_id=f6fde3de4aa44469b622b6f00e3baee9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1328, status=DOWN, tags=[], tenant_id=f6fde3de4aa44469b622b6f00e3baee9, updated_at=2025-11-26T10:01:11Z on network a67680f7-b6ab-4378-9ba9-788d286c7bb6#033[00m Nov 26 05:01:11 localhost systemd[1]: tmp-crun.C5drYR.mount: Deactivated 
successfully. Nov 26 05:01:11 localhost dnsmasq[314253]: exiting on receipt of SIGTERM Nov 26 05:01:11 localhost podman[314284]: 2025-11-26 10:01:11.321579733 +0000 UTC m=+0.065555605 container kill 10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:11 localhost systemd[1]: libpod-10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213.scope: Deactivated successfully. Nov 26 05:01:11 localhost podman[314307]: 2025-11-26 10:01:11.414271157 +0000 UTC m=+0.073782937 container died 10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:11 localhost podman[314307]: 2025-11-26 10:01:11.452231396 +0000 UTC m=+0.111743176 container cleanup 10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:01:11 localhost systemd[1]: libpod-conmon-10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213.scope: Deactivated successfully. Nov 26 05:01:11 localhost dnsmasq[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/addn_hosts - 1 addresses Nov 26 05:01:11 localhost dnsmasq-dhcp[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/host Nov 26 05:01:11 localhost podman[314338]: 2025-11-26 10:01:11.492735911 +0000 UTC m=+0.064851503 container kill 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:01:11 localhost dnsmasq-dhcp[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/opts Nov 26 05:01:11 localhost podman[314309]: 2025-11-26 10:01:11.550917557 +0000 UTC m=+0.200200466 container remove 10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 05:01:11 localhost systemd[1]: var-lib-containers-storage-overlay-f14b0ab69be9a8732b0cbbb12449a783a9e586a7785df9f5c53279971006ad34-merged.mount: Deactivated successfully. Nov 26 05:01:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10a8642778ee234e74803ee4ed190f4f06b1b0f9583830e0a368aeb3845a5213-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:11 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:11.895 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 2001:db8::f816:3eff:feaf:7403'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2 2001:db8::f816:3eff:feaf:7403'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 
'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:11.897 262471 INFO neutron.agent.dhcp.agent [None req-a0a7954d-f3b8-4cd8-9e30-e51f8a18b680 - - - - - -] DHCP configuration for ports {'523294ae-c3ef-4054-bb82-e941cbadf0ad'} is completed#033[00m Nov 26 05:01:11 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:11.897 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d updated#033[00m Nov 26 05:01:11 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:11.900 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port f5f5fa7e-4c75-4f21-8f3c-e8733aee4d12 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:11 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:11.900 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:11 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:11.901 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[7051cb7f-311f-4291-bd9e-2fb70736c9b9]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:12 localhost nova_compute[281415]: 2025-11-26 10:01:12.249 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:12 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:12.744 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:10Z, description=, device_id=2f9b4a1c-ee75-4d4a-be09-d59d248cfe0e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=523294ae-c3ef-4054-bb82-e941cbadf0ad, ip_allocation=immediate, mac_address=fa:16:3e:0a:71:29, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:58Z, description=, dns_domain=, id=a67680f7-b6ab-4378-9ba9-788d286c7bb6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-328279983, port_security_enabled=True, project_id=f6fde3de4aa44469b622b6f00e3baee9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42475, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1257, status=ACTIVE, subnets=['859066e2-4e5b-4ddd-9b43-bc90661a631f'], tags=[], tenant_id=f6fde3de4aa44469b622b6f00e3baee9, updated_at=2025-11-26T10:01:00Z, vlan_transparent=None, network_id=a67680f7-b6ab-4378-9ba9-788d286c7bb6, port_security_enabled=False, project_id=f6fde3de4aa44469b622b6f00e3baee9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1328, status=DOWN, tags=[], tenant_id=f6fde3de4aa44469b622b6f00e3baee9, 
updated_at=2025-11-26T10:01:11Z on network a67680f7-b6ab-4378-9ba9-788d286c7bb6#033[00m Nov 26 05:01:12 localhost systemd[1]: tmp-crun.UiVOZK.mount: Deactivated successfully. Nov 26 05:01:12 localhost dnsmasq[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/addn_hosts - 1 addresses Nov 26 05:01:12 localhost dnsmasq-dhcp[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/host Nov 26 05:01:12 localhost dnsmasq-dhcp[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/opts Nov 26 05:01:12 localhost podman[314408]: 2025-11-26 10:01:12.968008899 +0000 UTC m=+0.073614013 container kill 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:01:13 localhost podman[314445]: Nov 26 05:01:13 localhost podman[314445]: 2025-11-26 10:01:13.137430245 +0000 UTC m=+0.094742815 container create 3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:13 localhost systemd[1]: Started 
libpod-conmon-3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa.scope. Nov 26 05:01:13 localhost podman[314445]: 2025-11-26 10:01:13.090579013 +0000 UTC m=+0.047891613 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:13 localhost systemd[1]: Started libcrun container. Nov 26 05:01:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8b3b8fdd0d974b86fb41e4f88dfc5463301cba91c03eef1b33e8e67698b47c4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:13 localhost podman[314445]: 2025-11-26 10:01:13.216156617 +0000 UTC m=+0.173469177 container init 3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:13 localhost podman[314445]: 2025-11-26 10:01:13.225484112 +0000 UTC m=+0.182796682 container start 3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:01:13 localhost dnsmasq[314468]: started, version 2.85 cachesize 150 Nov 26 05:01:13 
localhost dnsmasq[314468]: DNS service limited to local subnets Nov 26 05:01:13 localhost dnsmasq[314468]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:13 localhost dnsmasq[314468]: warning: no upstream servers configured Nov 26 05:01:13 localhost dnsmasq-dhcp[314468]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:01:13 localhost dnsmasq[314468]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:13 localhost dnsmasq-dhcp[314468]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:13 localhost dnsmasq-dhcp[314468]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:13 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:13.251 262471 INFO neutron.agent.dhcp.agent [None req-52ba0e55-a7af-4283-beaf-8a72b611c7b4 - - - - - -] DHCP configuration for ports {'523294ae-c3ef-4054-bb82-e941cbadf0ad'} is completed#033[00m Nov 26 05:01:13 localhost dnsmasq[314468]: exiting on receipt of SIGTERM Nov 26 05:01:13 localhost podman[314486]: 2025-11-26 10:01:13.637990768 +0000 UTC m=+0.067893594 container kill 3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 26 05:01:13 localhost systemd[1]: libpod-3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa.scope: Deactivated successfully. 
Nov 26 05:01:13 localhost podman[314498]: 2025-11-26 10:01:13.719236234 +0000 UTC m=+0.060831945 container died 3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:13 localhost podman[314498]: 2025-11-26 10:01:13.753891916 +0000 UTC m=+0.095487557 container cleanup 3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:01:13 localhost systemd[1]: libpod-conmon-3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa.scope: Deactivated successfully. 
Nov 26 05:01:13 localhost podman[314500]: 2025-11-26 10:01:13.808853047 +0000 UTC m=+0.144438961 container remove 3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:01:13 localhost dnsmasq[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/addn_hosts - 0 addresses Nov 26 05:01:13 localhost dnsmasq-dhcp[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/host Nov 26 05:01:13 localhost podman[314543]: 2025-11-26 10:01:13.919958303 +0000 UTC m=+0.065330647 container kill 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:01:13 localhost dnsmasq-dhcp[313860]: read /var/lib/neutron/dhcp/a67680f7-b6ab-4378-9ba9-788d286c7bb6/opts Nov 26 05:01:13 localhost systemd[1]: var-lib-containers-storage-overlay-b8b3b8fdd0d974b86fb41e4f88dfc5463301cba91c03eef1b33e8e67698b47c4-merged.mount: Deactivated successfully. 
Nov 26 05:01:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ee159ae9c3b17d595e9d0a6b3fd5a7b82f3d60a0a005e8c49577e8b7869affa-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.044 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2 2001:db8::f816:3eff:feaf:7403'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 2001:db8::f816:3eff:feaf:7403'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 
'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.046 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d updated#033[00m Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.049 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port f5f5fa7e-4c75-4f21-8f3c-e8733aee4d12 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.049 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.050 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[28a34c42-b9b9-4a92-83c2-ca094bd67e98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:14 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:14.118 262471 INFO neutron.agent.dhcp.agent [None req-556aeaf8-374e-4104-86ad-c6a14f7a6f8a - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'c61a163e-5877-444d-94d4-34ab662947de'} is completed#033[00m Nov 26 05:01:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:14 
localhost ovn_controller[153664]: 2025-11-26T10:01:14Z|00229|binding|INFO|Releasing lport d7937d5d-fd9a-4156-9554-2b8e5c61a861 from this chassis (sb_readonly=0) Nov 26 05:01:14 localhost ovn_controller[153664]: 2025-11-26T10:01:14Z|00230|binding|INFO|Setting lport d7937d5d-fd9a-4156-9554-2b8e5c61a861 down in Southbound Nov 26 05:01:14 localhost nova_compute[281415]: 2025-11-26 10:01:14.240 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:14 localhost kernel: device tapd7937d5d-fd left promiscuous mode Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.254 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-a67680f7-b6ab-4378-9ba9-788d286c7bb6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a67680f7-b6ab-4378-9ba9-788d286c7bb6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f6fde3de4aa44469b622b6f00e3baee9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f4e28f9-bd6b-4809-ac7f-8831f217257b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d7937d5d-fd9a-4156-9554-2b8e5c61a861) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.256 159486 INFO neutron.agent.ovn.metadata.agent [-] Port d7937d5d-fd9a-4156-9554-2b8e5c61a861 in datapath a67680f7-b6ab-4378-9ba9-788d286c7bb6 unbound from our chassis#033[00m Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.258 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a67680f7-b6ab-4378-9ba9-788d286c7bb6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:01:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:14.259 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ac91c23b-d57a-458e-a4e0-db146bbcf8f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:14 localhost nova_compute[281415]: 2025-11-26 10:01:14.263 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:14 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:14.602 2 INFO neutron.agent.securitygroups_rpc [None req-fdfad57c-cfc8-4884-a2b3-ea098b149af6 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:14 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:14.638 2 INFO neutron.agent.securitygroups_rpc [None req-ddce0139-fa8d-4e10-aa9a-94b88f68076b e84da2ccc8624643b02b923452decadf 686e11ee04914d51ad8dbfc109d48ff7 - - default default] Security group member updated ['c0a14b41-776a-40df-8488-9607b72895ed']#033[00m Nov 26 05:01:15 localhost podman[314617]: Nov 26 05:01:15 localhost podman[314617]: 2025-11-26 
10:01:15.394585853 +0000 UTC m=+0.104825253 container create 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:01:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:01:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:01:15 localhost systemd[1]: Started libpod-conmon-62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7.scope. Nov 26 05:01:15 localhost podman[314617]: 2025-11-26 10:01:15.345230597 +0000 UTC m=+0.055470037 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:15 localhost systemd[1]: tmp-crun.gUDAaj.mount: Deactivated successfully. Nov 26 05:01:15 localhost systemd[1]: Started libcrun container. 
Nov 26 05:01:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/121a923701764389a99e7d3581ca357c67120d2a88911cfe61c1b15d71b688dd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:15 localhost podman[314617]: 2025-11-26 10:01:15.485081632 +0000 UTC m=+0.195321042 container init 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:01:15 localhost podman[314617]: 2025-11-26 10:01:15.495402406 +0000 UTC m=+0.205641816 container start 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 26 05:01:15 localhost dnsmasq[314657]: started, version 2.85 cachesize 150 Nov 26 05:01:15 localhost dnsmasq[314657]: DNS service limited to local subnets Nov 26 05:01:15 localhost dnsmasq[314657]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:15 localhost dnsmasq[314657]: warning: no upstream servers 
configured Nov 26 05:01:15 localhost dnsmasq-dhcp[314657]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:01:15 localhost dnsmasq-dhcp[314657]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:01:15 localhost dnsmasq[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:15 localhost dnsmasq-dhcp[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:15 localhost dnsmasq-dhcp[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:15 localhost podman[314630]: 2025-11-26 10:01:15.536630121 +0000 UTC m=+0.097927439 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:01:15 localhost podman[314630]: 2025-11-26 10:01:15.572477279 +0000 UTC m=+0.133774607 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:01:15 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:01:15 localhost podman[314632]: 2025-11-26 10:01:15.590250304 +0000 UTC m=+0.150265823 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:01:15 localhost podman[314632]: 2025-11-26 10:01:15.60305066 +0000 UTC m=+0.163066149 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 26 05:01:15 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:01:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:15.699 2 INFO neutron.agent.securitygroups_rpc [None req-9a94cce0-bed9-4c09-ae14-3235d0d15883 e84da2ccc8624643b02b923452decadf 686e11ee04914d51ad8dbfc109d48ff7 - - default default] Security group member updated ['c0a14b41-776a-40df-8488-9607b72895ed']#033[00m Nov 26 05:01:15 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:15.745 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:15 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:15.771 262471 INFO neutron.agent.dhcp.agent [None req-0aa5e207-3c60-4d31-adc2-8de181a4d133 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'c61a163e-5877-444d-94d4-34ab662947de'} is completed#033[00m Nov 26 05:01:15 localhost openstack_network_exporter[242153]: ERROR 10:01:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:01:15 localhost openstack_network_exporter[242153]: ERROR 10:01:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:01:15 localhost openstack_network_exporter[242153]: ERROR 10:01:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:01:15 localhost openstack_network_exporter[242153]: ERROR 10:01:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please 
specify an existing datapath Nov 26 05:01:15 localhost openstack_network_exporter[242153]: Nov 26 05:01:15 localhost openstack_network_exporter[242153]: ERROR 10:01:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:01:15 localhost openstack_network_exporter[242153]: Nov 26 05:01:15 localhost nova_compute[281415]: 2025-11-26 10:01:15.965 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:15 localhost dnsmasq[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 2 addresses Nov 26 05:01:15 localhost dnsmasq-dhcp[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:15 localhost podman[314693]: 2025-11-26 10:01:15.973314 +0000 UTC m=+0.066699138 container kill 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:01:15 localhost dnsmasq-dhcp[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:15.991 2 INFO neutron.agent.securitygroups_rpc [None req-cdb3f1ee-b0c8-4f5b-9064-6d3c21a20cb8 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:16.186 262471 INFO neutron.agent.dhcp.agent 
[None req-fc95db35-3596-4bd7-8cea-c2bcb1a6f321 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:08Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=48abd84e-ed76-4b5b-a680-be4718d615b3, ip_allocation=immediate, mac_address=fa:16:3e:f2:6a:65, name=tempest-NetworksTestDHCPv6-1024634978, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['61fa09e6-0364-4981-b3cd-1c5483c56758', '89281dc6-f813-45ba-8c78-9af190b62d48'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:03Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1311, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:08Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:01:16 localhost dnsmasq[314067]: exiting on receipt of SIGTERM Nov 26 05:01:16 localhost podman[314734]: 2025-11-26 10:01:16.246534788 +0000 UTC m=+0.065890544 container kill 
44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eb17c5e9-479e-4734-b91b-09d4f037002c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:16 localhost systemd[1]: libpod-44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f.scope: Deactivated successfully. Nov 26 05:01:16 localhost podman[314749]: 2025-11-26 10:01:16.325405494 +0000 UTC m=+0.061546856 container died 44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eb17c5e9-479e-4734-b91b-09d4f037002c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:01:16 localhost podman[314749]: 2025-11-26 10:01:16.362294452 +0000 UTC m=+0.098435744 container cleanup 44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eb17c5e9-479e-4734-b91b-09d4f037002c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3) Nov 26 05:01:16 localhost systemd[1]: libpod-conmon-44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f.scope: Deactivated successfully. Nov 26 05:01:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:16.377 262471 INFO neutron.agent.dhcp.agent [None req-b54f3be6-1df6-4dc9-83f6-97aca34e6096 - - - - - -] DHCP configuration for ports {'6491d301-d5cf-48d0-ad61-9457148dd1cb', 'ba010266-c829-4775-9f81-9e5e8ac0a898', 'c61a163e-5877-444d-94d4-34ab662947de'} is completed#033[00m Nov 26 05:01:16 localhost systemd[1]: var-lib-containers-storage-overlay-e91ae83446e8fe46e6270c82bffeef94dd3cc465ad2416e8263bf393c090f8cb-merged.mount: Deactivated successfully. Nov 26 05:01:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:16 localhost podman[314752]: 2025-11-26 10:01:16.420396396 +0000 UTC m=+0.147480851 container remove 44b5ba9f7e273ce559b55f594804a9389e8f2b55aa88b292d6d82b045d95f80f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eb17c5e9-479e-4734-b91b-09d4f037002c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:01:16 localhost nova_compute[281415]: 2025-11-26 10:01:16.436 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:16 localhost kernel: device tapf2bee9b2-20 left promiscuous mode Nov 26 05:01:16 localhost ovn_controller[153664]: 2025-11-26T10:01:16Z|00231|binding|INFO|Releasing lport f2bee9b2-20ce-4529-8532-a72dc6f86276 from 
this chassis (sb_readonly=0) Nov 26 05:01:16 localhost ovn_controller[153664]: 2025-11-26T10:01:16Z|00232|binding|INFO|Setting lport f2bee9b2-20ce-4529-8532-a72dc6f86276 down in Southbound Nov 26 05:01:16 localhost nova_compute[281415]: 2025-11-26 10:01:16.459 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:16.473 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-eb17c5e9-479e-4734-b91b-09d4f037002c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb17c5e9-479e-4734-b91b-09d4f037002c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f6fde3de4aa44469b622b6f00e3baee9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9502b79a-5d51-45fb-bffa-0cd60cd8583e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f2bee9b2-20ce-4529-8532-a72dc6f86276) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:16.475 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port f2bee9b2-20ce-4529-8532-a72dc6f86276 in datapath eb17c5e9-479e-4734-b91b-09d4f037002c unbound from our chassis#033[00m Nov 26 05:01:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:16.477 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eb17c5e9-479e-4734-b91b-09d4f037002c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:01:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:16.479 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[490f3dd0-ce37-44ea-8fc8-0955b3ef0d52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:16 localhost dnsmasq[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 4 addresses Nov 26 05:01:16 localhost dnsmasq-dhcp[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:16 localhost podman[314796]: 2025-11-26 10:01:16.552076239 +0000 UTC m=+0.070573692 container kill 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:01:16 localhost dnsmasq-dhcp[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:16.783 262471 INFO neutron.agent.dhcp.agent [None req-22ade980-d650-4266-9690-ae5b43ef4cb6 - 
- - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:14Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=6491d301-d5cf-48d0-ad61-9457148dd1cb, ip_allocation=immediate, mac_address=fa:16:3e:66:71:5e, name=tempest-NetworksTestDHCPv6-2132289543, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['44f98135-e7fc-421a-8571-f2a1a86a49fe', 'b9704226-6597-4eb4-93a8-c6014e6170f8'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:12Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1364, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:14Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:01:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:16.862 262471 INFO neutron.agent.dhcp.agent [None req-239b49d0-5763-45f5-b1eb-cc92efbe465d - - - - - -] DHCP configuration for ports {'48abd84e-ed76-4b5b-a680-be4718d615b3'} is 
completed#033[00m Nov 26 05:01:17 localhost dnsmasq[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 4 addresses Nov 26 05:01:17 localhost dnsmasq-dhcp[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:17 localhost dnsmasq-dhcp[314657]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:17 localhost podman[314836]: 2025-11-26 10:01:17.036479015 +0000 UTC m=+0.069501991 container kill 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:01:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:17.078 262471 INFO neutron.agent.dhcp.agent [None req-e024a2a5-84ff-4adf-8750-4719eda700d8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:17.109 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:17 localhost nova_compute[281415]: 2025-11-26 10:01:17.276 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:17.295 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:17.397 262471 INFO neutron.agent.dhcp.agent [None 
req-b5116778-88a2-479d-a891-23bff47f6db6 - - - - - -] DHCP configuration for ports {'6491d301-d5cf-48d0-ad61-9457148dd1cb'} is completed#033[00m Nov 26 05:01:17 localhost systemd[1]: run-netns-qdhcp\x2deb17c5e9\x2d479e\x2d4734\x2db91b\x2d09d4f037002c.mount: Deactivated successfully. Nov 26 05:01:17 localhost ovn_controller[153664]: 2025-11-26T10:01:17Z|00233|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:17 localhost nova_compute[281415]: 2025-11-26 10:01:17.540 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:17 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:17.543 2 INFO neutron.agent.securitygroups_rpc [None req-dc2febf1-71b1-4d20-a145-459b4f85411b a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:17 localhost dnsmasq[314657]: exiting on receipt of SIGTERM Nov 26 05:01:17 localhost podman[314875]: 2025-11-26 10:01:17.618044986 +0000 UTC m=+0.068593674 container kill 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:01:17 localhost systemd[1]: libpod-62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7.scope: Deactivated successfully. 
Nov 26 05:01:17 localhost podman[314889]: 2025-11-26 10:01:17.704136225 +0000 UTC m=+0.066122550 container died 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:17 localhost podman[314889]: 2025-11-26 10:01:17.740380044 +0000 UTC m=+0.102366349 container cleanup 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:01:17 localhost systemd[1]: libpod-conmon-62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7.scope: Deactivated successfully. 
Nov 26 05:01:17 localhost podman[314891]: 2025-11-26 10:01:17.778252382 +0000 UTC m=+0.133195340 container remove 62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:01:18 localhost systemd[1]: var-lib-containers-storage-overlay-121a923701764389a99e7d3581ca357c67120d2a88911cfe61c1b15d71b688dd-merged.mount: Deactivated successfully. Nov 26 05:01:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62eb2a31548df29034094703ba13f69e2b2a84dfe00ee8f80dbb6f11b48bcfd7-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:18 localhost podman[314971]: Nov 26 05:01:18 localhost podman[314995]: 2025-11-26 10:01:18.956675735 +0000 UTC m=+0.066282295 container kill 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 05:01:18 localhost dnsmasq[313860]: exiting on receipt of SIGTERM Nov 26 05:01:18 localhost systemd[1]: libpod-9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4.scope: Deactivated successfully. 
Nov 26 05:01:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:18.978 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2 2001:db8::f816:3eff:feaf:7403'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:18.980 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d updated#033[00m Nov 26 05:01:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:18.981 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port f5f5fa7e-4c75-4f21-8f3c-e8733aee4d12 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:18.981 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:18.983 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9a80e3-33a6-475e-93d3-576f91bc22f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:18 localhost podman[314971]: 2025-11-26 10:01:18.996416176 +0000 UTC m=+0.191752985 container create 1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:19 localhost podman[314971]: 2025-11-26 
10:01:18.90904593 +0000 UTC m=+0.104382779 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:19 localhost systemd[1]: Started libpod-conmon-1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa.scope. Nov 26 05:01:19 localhost podman[315006]: 2025-11-26 10:01:19.049853513 +0000 UTC m=+0.076493167 container died 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:19 localhost systemd[1]: Started libcrun container. Nov 26 05:01:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b277b214e9f08627ab87c3c8bc98142801df7eee6f448dfd5f78dcfc2a32aba0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:19 localhost podman[314971]: 2025-11-26 10:01:19.077782336 +0000 UTC m=+0.273119195 container init 1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:19 localhost podman[314971]: 2025-11-26 10:01:19.086399931 +0000 UTC m=+0.281736740 container 
start 1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:01:19 localhost dnsmasq[315032]: started, version 2.85 cachesize 150 Nov 26 05:01:19 localhost dnsmasq[315032]: DNS service limited to local subnets Nov 26 05:01:19 localhost dnsmasq[315032]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:19 localhost dnsmasq[315032]: warning: no upstream servers configured Nov 26 05:01:19 localhost dnsmasq-dhcp[315032]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:01:19 localhost dnsmasq[315032]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:19 localhost dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:19 localhost dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:19 localhost podman[315006]: 2025-11-26 10:01:19.13928351 +0000 UTC m=+0.165923144 container cleanup 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 05:01:19 localhost systemd[1]: libpod-conmon-9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4.scope: Deactivated successfully. Nov 26 05:01:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:19 localhost podman[315008]: 2025-11-26 10:01:19.165084462 +0000 UTC m=+0.181052371 container remove 9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a67680f7-b6ab-4378-9ba9-788d286c7bb6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:19.203 262471 INFO neutron.agent.dhcp.agent [None req-63c86135-dd7e-499e-adf2-edeba6b722ac - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:19.319 262471 INFO neutron.agent.dhcp.agent [None req-64d7b527-7a1c-48ce-92cb-e53939bdf047 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'c61a163e-5877-444d-94d4-34ab662947de'} is completed#033[00m Nov 26 05:01:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:19.346 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:19.356 262471 INFO neutron.agent.dhcp.agent [-] Network not 
present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:19 localhost systemd[1]: tmp-crun.Q53NCg.mount: Deactivated successfully. Nov 26 05:01:19 localhost systemd[1]: var-lib-containers-storage-overlay-e4f2887044c3a71aa35078f40444aa57ec739fb227f3d27ed3109d60112fdf9b-merged.mount: Deactivated successfully. Nov 26 05:01:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9aacdd4777bf676e24da06a4ad9871ad88368bc3ba50964b36f1e7e08301fea4-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:19 localhost systemd[1]: run-netns-qdhcp\x2da67680f7\x2db6ab\x2d4378\x2d9ba9\x2d788d286c7bb6.mount: Deactivated successfully. Nov 26 05:01:19 localhost dnsmasq[315032]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:19 localhost dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:19 localhost podman[315058]: 2025-11-26 10:01:19.472059204 +0000 UTC m=+0.066056948 container kill 1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:19 localhost dnsmasq-dhcp[315032]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:19 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:19.498 2 INFO neutron.agent.securitygroups_rpc [None req-8998b8cb-ab5e-4f73-9777-47dbccdd7e2e a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated 
['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:19 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:19.624 2 INFO neutron.agent.securitygroups_rpc [None req-8998b8cb-ab5e-4f73-9777-47dbccdd7e2e a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:19.986 262471 INFO neutron.agent.dhcp.agent [None req-d05fa3aa-c4ba-45d4-956f-e5fe51ec8533 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'c61a163e-5877-444d-94d4-34ab662947de'} is completed#033[00m Nov 26 05:01:20 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:20.160 2 INFO neutron.agent.securitygroups_rpc [None req-39cece56-1816-4361-b0bc-13ab3d9560e1 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:20.185 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:20.200 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:20 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:20.526 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2 2001:db8::f816:3eff:feaf:7403'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 
2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:20 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:20.528 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d updated#033[00m Nov 26 05:01:20 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:20.531 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port f5f5fa7e-4c75-4f21-8f3c-e8733aee4d12 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:20 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:20.531 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:20 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:20.532 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f109cf-4e14-4bc0-9bab-f4f10b0cb8ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:20 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:20.818 2 INFO neutron.agent.securitygroups_rpc [None req-95d96546-1966-4430-9597-415e314751a5 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:21 localhost nova_compute[281415]: 2025-11-26 10:01:21.004 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:21 localhost dnsmasq[315032]: exiting on receipt of SIGTERM Nov 26 05:01:21 localhost podman[315098]: 2025-11-26 10:01:21.292015077 +0000 UTC m=+0.066378399 container kill 1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:01:21 localhost 
systemd[1]: libpod-1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa.scope: Deactivated successfully. Nov 26 05:01:21 localhost podman[315111]: 2025-11-26 10:01:21.368903954 +0000 UTC m=+0.061981658 container died 1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:21 localhost podman[315111]: 2025-11-26 10:01:21.407572875 +0000 UTC m=+0.100650529 container cleanup 1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:01:21 localhost systemd[1]: libpod-conmon-1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa.scope: Deactivated successfully. 
Nov 26 05:01:21 localhost podman[315113]: 2025-11-26 10:01:21.450853002 +0000 UTC m=+0.131956974 container remove 1b75638f71a2a7fb03d6af174e0b66b737b4f09aaec5c5a5cbd8021b16fd01fa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:21 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:21.516 2 INFO neutron.agent.securitygroups_rpc [None req-0b255fc8-ec52-40cf-b14c-25f64322ce43 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:21.559 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:21 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:21.992 2 INFO neutron.agent.securitygroups_rpc [None req-34e64f9e-529a-4f81-a184-ed9695d31bfa 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:22 localhost systemd[1]: var-lib-containers-storage-overlay-b277b214e9f08627ab87c3c8bc98142801df7eee6f448dfd5f78dcfc2a32aba0-merged.mount: Deactivated successfully. 
Nov 26 05:01:22 localhost nova_compute[281415]: 2025-11-26 10:01:22.310 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:22 localhost podman[315191]: Nov 26 05:01:22 localhost podman[315191]: 2025-11-26 10:01:22.912120036 +0000 UTC m=+0.086977036 container create ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:01:22 localhost systemd[1]: Started libpod-conmon-ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f.scope. Nov 26 05:01:22 localhost systemd[1]: tmp-crun.yN3d05.mount: Deactivated successfully. Nov 26 05:01:22 localhost systemd[1]: Started libcrun container. 
Nov 26 05:01:22 localhost podman[315191]: 2025-11-26 10:01:22.873626051 +0000 UTC m=+0.048483031 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a28ebedc65c9a704971808b3cd0ff9a36b9ddd12694681c19b02c9044fac3d21/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:22 localhost podman[315191]: 2025-11-26 10:01:22.985472069 +0000 UTC m=+0.160329069 container init ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:01:22 localhost podman[315191]: 2025-11-26 10:01:22.992659252 +0000 UTC m=+0.167516242 container start ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:01:22 localhost dnsmasq[315209]: started, version 2.85 cachesize 150 Nov 26 05:01:22 localhost dnsmasq[315209]: DNS service limited to local subnets Nov 26 05:01:22 localhost dnsmasq[315209]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:22 localhost dnsmasq[315209]: warning: no upstream servers configured Nov 26 05:01:22 localhost dnsmasq-dhcp[315209]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:01:22 localhost dnsmasq[315209]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:22 localhost dnsmasq-dhcp[315209]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:22 localhost dnsmasq-dhcp[315209]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:23 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:23.337 262471 INFO neutron.agent.dhcp.agent [None req-288f02ab-a9f0-401a-b357-b86c2810b77e - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'c61a163e-5877-444d-94d4-34ab662947de'} is completed#033[00m Nov 26 05:01:23 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:23.418 2 INFO neutron.agent.securitygroups_rpc [None req-0247ebce-674c-4505-b32a-8cc7a00b2010 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:23 localhost dnsmasq[315209]: exiting on receipt of SIGTERM Nov 26 05:01:23 localhost systemd[1]: libpod-ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f.scope: Deactivated successfully. 
Nov 26 05:01:23 localhost podman[315228]: 2025-11-26 10:01:23.446697142 +0000 UTC m=+0.067585524 container kill ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:01:23 localhost podman[315240]: 2025-11-26 10:01:23.52055601 +0000 UTC m=+0.064291967 container died ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:01:23 localhost podman[315240]: 2025-11-26 10:01:23.652016527 +0000 UTC m=+0.195752424 container cleanup ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 
05:01:23 localhost systemd[1]: libpod-conmon-ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f.scope: Deactivated successfully. Nov 26 05:01:23 localhost podman[315247]: 2025-11-26 10:01:23.682203448 +0000 UTC m=+0.205973725 container remove ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:01:23 localhost ovn_controller[153664]: 2025-11-26T10:01:23Z|00234|binding|INFO|Releasing lport c61a163e-5877-444d-94d4-34ab662947de from this chassis (sb_readonly=0) Nov 26 05:01:23 localhost kernel: device tapc61a163e-58 left promiscuous mode Nov 26 05:01:23 localhost nova_compute[281415]: 2025-11-26 10:01:23.698 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:23 localhost ovn_controller[153664]: 2025-11-26T10:01:23Z|00235|binding|INFO|Setting lport c61a163e-5877-444d-94d4-34ab662947de down in Southbound Nov 26 05:01:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:23.716 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 
2001:db8::f816:3eff:fec6:1ea3/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '12', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c61a163e-5877-444d-94d4-34ab662947de) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:23.718 159486 INFO neutron.agent.ovn.metadata.agent [-] Port c61a163e-5877-444d-94d4-34ab662947de in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:01:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:23.720 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:23.722 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ece268-5b26-41c0-b289-771894f4c746]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:23 localhost nova_compute[281415]: 2025-11-26 10:01:23.727 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:23 localhost sshd[315271]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:01:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:23.895 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2 2001:db8::f816:3eff:feaf:7403'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:23.898 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d updated#033[00m Nov 26 05:01:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:23.900 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:23.901 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe5e251-f7e0-419b-8ddc-c279c171f7bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:24 localhost systemd[1]: var-lib-containers-storage-overlay-a28ebedc65c9a704971808b3cd0ff9a36b9ddd12694681c19b02c9044fac3d21-merged.mount: Deactivated successfully. Nov 26 05:01:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec718efa58a9bb4ef2bf6b5eee9057105f4b328744568f74120b8915c1dd039f-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:24 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. Nov 26 05:01:24 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Nov 26 05:01:24 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:24.559 2 INFO neutron.agent.securitygroups_rpc [None req-edf29fe8-378c-4ec8-9231-0a1c574f04f2 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:24 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:24.585 262471 INFO neutron.agent.linux.ip_lib [None req-8451eadb-6bcd-42c0-8306-9bfa93606859 - - - - - -] Device tap0dc44189-6e cannot be used as it has no MAC address#033[00m Nov 26 05:01:24 localhost nova_compute[281415]: 2025-11-26 10:01:24.651 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:24 localhost kernel: device tap0dc44189-6e entered promiscuous mode Nov 26 05:01:24 localhost NetworkManager[5970]: [1764151284.6635] manager: (tap0dc44189-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/40) Nov 26 05:01:24 localhost ovn_controller[153664]: 2025-11-26T10:01:24Z|00236|binding|INFO|Claiming lport 0dc44189-6e84-4e22-b9cc-af00f287e651 for this chassis. Nov 26 05:01:24 localhost ovn_controller[153664]: 2025-11-26T10:01:24Z|00237|binding|INFO|0dc44189-6e84-4e22-b9cc-af00f287e651: Claiming unknown Nov 26 05:01:24 localhost nova_compute[281415]: 2025-11-26 10:01:24.664 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:24 localhost systemd-udevd[315283]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:01:24 localhost ovn_controller[153664]: 2025-11-26T10:01:24Z|00238|binding|INFO|Setting lport 0dc44189-6e84-4e22-b9cc-af00f287e651 ovn-installed in OVS Nov 26 05:01:24 localhost ovn_controller[153664]: 2025-11-26T10:01:24Z|00239|binding|INFO|Setting lport 0dc44189-6e84-4e22-b9cc-af00f287e651 up in Southbound Nov 26 05:01:24 localhost nova_compute[281415]: 2025-11-26 10:01:24.675 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:24.678 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0dc44189-6e84-4e22-b9cc-af00f287e651) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:24 localhost 
ovn_metadata_agent[159481]: 2025-11-26 10:01:24.680 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0dc44189-6e84-4e22-b9cc-af00f287e651 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:01:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:24.682 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port b7a191cf-e333-4d79-b455-25458f5e64f8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:24.682 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:24.683 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1faf0192-2b63-4dd9-a383-ef7c9146d772]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:24 localhost journal[229445]: ethtool ioctl error on tap0dc44189-6e: No such device Nov 26 05:01:24 localhost nova_compute[281415]: 2025-11-26 10:01:24.698 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:24 localhost journal[229445]: ethtool ioctl error on tap0dc44189-6e: No such device Nov 26 05:01:24 localhost journal[229445]: ethtool ioctl error on tap0dc44189-6e: No such device Nov 26 05:01:24 localhost journal[229445]: ethtool ioctl error on tap0dc44189-6e: No such device Nov 26 05:01:24 localhost journal[229445]: ethtool ioctl error on tap0dc44189-6e: No such device Nov 26 05:01:24 localhost journal[229445]: ethtool ioctl error on tap0dc44189-6e: No such device Nov 26 05:01:24 localhost 
journal[229445]: ethtool ioctl error on tap0dc44189-6e: No such device Nov 26 05:01:24 localhost journal[229445]: ethtool ioctl error on tap0dc44189-6e: No such device Nov 26 05:01:24 localhost nova_compute[281415]: 2025-11-26 10:01:24.753 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:24 localhost nova_compute[281415]: 2025-11-26 10:01:24.786 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:25 localhost sshd[315319]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:01:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:25.221 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2 2001:db8::f816:3eff:feaf:7403'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], 
logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:25.224 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d updated#033[00m Nov 26 05:01:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:25.226 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port b7a191cf-e333-4d79-b455-25458f5e64f8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:25.226 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:25.227 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf625af-a10c-4723-b533-b008b5bfb413]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:25 localhost neutron_sriov_agent[255515]: 2025-11-26 
10:01:25.670 2 INFO neutron.agent.securitygroups_rpc [None req-fff4f275-f8a1-4da2-9c90-afb04472e648 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:25 localhost podman[315357]: Nov 26 05:01:25 localhost podman[315357]: 2025-11-26 10:01:25.765410495 +0000 UTC m=+0.101029460 container create 5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:01:25 localhost podman[315357]: 2025-11-26 10:01:25.715410951 +0000 UTC m=+0.051029956 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:25 localhost systemd[1]: Started libpod-conmon-5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703.scope. Nov 26 05:01:25 localhost systemd[1]: Started libcrun container. 
Nov 26 05:01:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc182fe9d792fa9e47e9bfed6e9d6f39da9a9a0358cf9d3385fd7400680482dd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:01:25 localhost podman[315357]: 2025-11-26 10:01:25.867583958 +0000 UTC m=+0.203202923 container init 5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 26 05:01:25 localhost podman[315357]: 2025-11-26 10:01:25.877180082 +0000 UTC m=+0.212799047 container start 5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 26 05:01:25 localhost dnsmasq[315375]: started, version 2.85 cachesize 150
Nov 26 05:01:25 localhost dnsmasq[315375]: DNS service limited to local subnets
Nov 26 05:01:25 localhost dnsmasq[315375]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:01:25 localhost dnsmasq[315375]: warning: no upstream servers configured
Nov 26 05:01:25 localhost dnsmasq-dhcp[315375]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 26 05:01:25 localhost dnsmasq[315375]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:01:25 localhost dnsmasq-dhcp[315375]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host
Nov 26 05:01:25 localhost dnsmasq-dhcp[315375]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts
Nov 26 05:01:26 localhost nova_compute[281415]: 2025-11-26 10:01:26.040 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:26 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:26.075 262471 INFO neutron.agent.dhcp.agent [None req-69b28175-efd6-4885-954b-619e931770b4 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m
Nov 26 05:01:26 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:26.207 2 INFO neutron.agent.securitygroups_rpc [None req-7f76b873-3d6c-4b1f-9d35-02dbe6c32358 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m
Nov 26 05:01:26 localhost dnsmasq[315375]: exiting on receipt of SIGTERM
Nov 26 05:01:26 localhost podman[315393]: 2025-11-26 10:01:26.268820011 +0000 UTC m=+0.063000448 container kill 5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 26 05:01:26 localhost systemd[1]: libpod-5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703.scope: Deactivated successfully.
Nov 26 05:01:26 localhost podman[315407]: 2025-11-26 10:01:26.352822528 +0000 UTC m=+0.066192472 container died 5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 26 05:01:26 localhost podman[315407]: 2025-11-26 10:01:26.390909332 +0000 UTC m=+0.104279246 container cleanup 5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 26 05:01:26 localhost systemd[1]: libpod-conmon-5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703.scope: Deactivated successfully.
Nov 26 05:01:26 localhost podman[315409]: 2025-11-26 10:01:26.435915699 +0000 UTC m=+0.142598266 container remove 5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 26 05:01:26 localhost systemd[1]: var-lib-containers-storage-overlay-dc182fe9d792fa9e47e9bfed6e9d6f39da9a9a0358cf9d3385fd7400680482dd-merged.mount: Deactivated successfully.
Nov 26 05:01:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c3a17b808be65dbeb3bd5432ecc176475efda76030349984c7b73284323b703-userdata-shm.mount: Deactivated successfully.
Nov 26 05:01:27 localhost nova_compute[281415]: 2025-11-26 10:01:27.330 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:27 localhost podman[240049]: time="2025-11-26T10:01:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 26 05:01:27 localhost podman[240049]: @ - - [26/Nov/2025:10:01:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1"
Nov 26 05:01:27 localhost podman[315487]:
Nov 26 05:01:27 localhost podman[240049]: @ - - [26/Nov/2025:10:01:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18778 "" "Go-http-client/1.1"
Nov 26 05:01:27 localhost podman[315487]: 2025-11-26 10:01:27.464979618 +0000 UTC m=+0.053899931 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:01:27 localhost podman[315487]: 2025-11-26 10:01:27.56676271 +0000 UTC m=+0.155683003 container create 1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 26 05:01:27 localhost systemd[1]: Started libpod-conmon-1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873.scope.
Nov 26 05:01:27 localhost systemd[1]: tmp-crun.Pu3lU8.mount: Deactivated successfully.
Nov 26 05:01:27 localhost systemd[1]: Started libcrun container.
Nov 26 05:01:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91227fe3df0756d0abc0d08dc289698a6de42604f0182b8f1628898df2cc9780/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:01:27 localhost podman[315487]: 2025-11-26 10:01:27.633960592 +0000 UTC m=+0.222880855 container init 1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 05:01:27 localhost podman[315487]: 2025-11-26 10:01:27.639402233 +0000 UTC m=+0.228322496 container start 1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 26 05:01:27 localhost dnsmasq[315505]: started, version 2.85 cachesize 150
Nov 26 05:01:27 localhost dnsmasq[315505]: DNS service limited to local subnets
Nov 26 05:01:27 localhost dnsmasq[315505]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:01:27 localhost dnsmasq[315505]: warning: no upstream servers configured
Nov 26 05:01:27 localhost dnsmasq-dhcp[315505]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 26 05:01:27 localhost dnsmasq-dhcp[315505]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 26 05:01:27 localhost dnsmasq[315505]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 2 addresses
Nov 26 05:01:27 localhost dnsmasq-dhcp[315505]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host
Nov 26 05:01:27 localhost dnsmasq-dhcp[315505]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts
Nov 26 05:01:27 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:27.897 262471 INFO neutron.agent.dhcp.agent [None req-83924d9f-ba13-42d1-9477-ad6f0a301ab0 - - - - - -] DHCP configuration for ports {'dead0e52-ca94-457b-9dda-5bd21ce9ca00', '0dc44189-6e84-4e22-b9cc-af00f287e651', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m
Nov 26 05:01:28 localhost dnsmasq[315505]: exiting on receipt of SIGTERM
Nov 26 05:01:28 localhost podman[315523]: 2025-11-26 10:01:28.033858426 +0000 UTC m=+0.070657126 container kill 1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:01:28 localhost systemd[1]: libpod-1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873.scope: Deactivated successfully.
Nov 26 05:01:28 localhost podman[315537]: 2025-11-26 10:01:28.114353679 +0000 UTC m=+0.066779230 container died 1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 26 05:01:28 localhost podman[315537]: 2025-11-26 10:01:28.201424337 +0000 UTC m=+0.153849838 container cleanup 1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 26 05:01:28 localhost systemd[1]: libpod-conmon-1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873.scope: Deactivated successfully.
Nov 26 05:01:28 localhost podman[315544]: 2025-11-26 10:01:28.230095103 +0000 UTC m=+0.166513642 container remove 1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 26 05:01:28 localhost nova_compute[281415]: 2025-11-26 10:01:28.284 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:28 localhost systemd[1]: var-lib-containers-storage-overlay-91227fe3df0756d0abc0d08dc289698a6de42604f0182b8f1628898df2cc9780-merged.mount: Deactivated successfully.
Nov 26 05:01:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1c36b1ff22c458f2bbd504e91e7b662d27eb26368d099c554083a1c9c655a873-userdata-shm.mount: Deactivated successfully.
Nov 26 05:01:28 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:28.798 262471 INFO neutron.agent.linux.ip_lib [None req-459a6f71-cb41-4c99-8ec4-79bd111d09f8 - - - - - -] Device tapbe3415dd-92 cannot be used as it has no MAC address#033[00m
Nov 26 05:01:28 localhost nova_compute[281415]: 2025-11-26 10:01:28.833 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:28 localhost kernel: device tapbe3415dd-92 entered promiscuous mode
Nov 26 05:01:28 localhost ovn_controller[153664]: 2025-11-26T10:01:28Z|00240|binding|INFO|Claiming lport be3415dd-920c-4817-841b-9e5ad30419d2 for this chassis.
Nov 26 05:01:28 localhost nova_compute[281415]: 2025-11-26 10:01:28.842 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:28 localhost ovn_controller[153664]: 2025-11-26T10:01:28Z|00241|binding|INFO|be3415dd-920c-4817-841b-9e5ad30419d2: Claiming unknown
Nov 26 05:01:28 localhost NetworkManager[5970]: [1764151288.8432] manager: (tapbe3415dd-92): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Nov 26 05:01:28 localhost systemd-udevd[315604]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 05:01:28 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:28.859 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-537168b7-124e-4fa7-8d35-689fb2776839', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-537168b7-124e-4fa7-8d35-689fb2776839', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '083b00bb83474f96865b0c5a38c5f88f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80867537-71e8-4eeb-a4c9-0594c30408ce, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=be3415dd-920c-4817-841b-9e5ad30419d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:01:28 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:28.862 159486 INFO neutron.agent.ovn.metadata.agent [-] Port be3415dd-920c-4817-841b-9e5ad30419d2 in datapath 537168b7-124e-4fa7-8d35-689fb2776839 bound to our chassis#033[00m
Nov 26 05:01:28 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:28.863 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 537168b7-124e-4fa7-8d35-689fb2776839 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 26 05:01:28 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:28.864 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[335acc5a-14cb-4379-92d0-bbe72dc80a6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:01:28 localhost ovn_controller[153664]: 2025-11-26T10:01:28Z|00242|binding|INFO|Setting lport be3415dd-920c-4817-841b-9e5ad30419d2 ovn-installed in OVS
Nov 26 05:01:28 localhost ovn_controller[153664]: 2025-11-26T10:01:28Z|00243|binding|INFO|Setting lport be3415dd-920c-4817-841b-9e5ad30419d2 up in Southbound
Nov 26 05:01:28 localhost nova_compute[281415]: 2025-11-26 10:01:28.905 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:28 localhost nova_compute[281415]: 2025-11-26 10:01:28.908 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:28 localhost nova_compute[281415]: 2025-11-26 10:01:28.959 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:28 localhost nova_compute[281415]: 2025-11-26 10:01:28.998 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 05:01:29 localhost podman[315640]:
Nov 26 05:01:29 localhost podman[315640]: 2025-11-26 10:01:29.22554938 +0000 UTC m=+0.111221901 container create a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 26 05:01:29 localhost systemd[1]: Started libpod-conmon-a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57.scope.
Nov 26 05:01:29 localhost podman[315640]: 2025-11-26 10:01:29.176073171 +0000 UTC m=+0.061745712 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:01:29 localhost systemd[1]: Started libcrun container.
Nov 26 05:01:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1348d8aaf01f058f309cdc23a63a4345175cefda16b62ad26a01d35ebd90358/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:01:29 localhost podman[315640]: 2025-11-26 10:01:29.310252268 +0000 UTC m=+0.195924789 container init a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 26 05:01:29 localhost podman[315640]: 2025-11-26 10:01:29.324148898 +0000 UTC m=+0.209821409 container start a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:01:29 localhost dnsmasq[315666]: started, version 2.85 cachesize 150
Nov 26 05:01:29 localhost dnsmasq[315666]: DNS service limited to local subnets
Nov 26 05:01:29 localhost dnsmasq[315666]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:01:29 localhost dnsmasq[315666]: warning: no upstream servers configured
Nov 26 05:01:29 localhost dnsmasq-dhcp[315666]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 26 05:01:29 localhost dnsmasq[315666]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:01:29 localhost dnsmasq-dhcp[315666]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host
Nov 26 05:01:29 localhost dnsmasq-dhcp[315666]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts
Nov 26 05:01:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:29.571 262471 INFO neutron.agent.dhcp.agent [None req-3474c9d5-88fe-4ec5-a19c-3e11fb707f68 - - - - - -] DHCP configuration for ports {'0dc44189-6e84-4e22-b9cc-af00f287e651', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m
Nov 26 05:01:29 localhost dnsmasq[315666]: exiting on receipt of SIGTERM
Nov 26 05:01:29 localhost podman[315695]: 2025-11-26 10:01:29.720827737 +0000 UTC m=+0.065015679 container kill a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 26 05:01:29 localhost systemd[1]: libpod-a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57.scope: Deactivated successfully.
Nov 26 05:01:29 localhost systemd[1]: tmp-crun.mIVgK2.mount: Deactivated successfully.
Nov 26 05:01:29 localhost podman[315710]: 2025-11-26 10:01:29.807862684 +0000 UTC m=+0.065745581 container died a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:01:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57-userdata-shm.mount: Deactivated successfully.
Nov 26 05:01:29 localhost podman[315710]: 2025-11-26 10:01:29.891376277 +0000 UTC m=+0.149259104 container cleanup a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 05:01:29 localhost systemd[1]: libpod-conmon-a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57.scope: Deactivated successfully.
Nov 26 05:01:29 localhost podman[315711]: 2025-11-26 10:01:29.913023345 +0000 UTC m=+0.167152760 container remove a595ba13cdc21364f7f9a835eaac39105c93423f44e7bb99edc5d508693eda57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 26 05:01:29 localhost ovn_controller[153664]: 2025-11-26T10:01:29Z|00244|binding|INFO|Releasing lport 0dc44189-6e84-4e22-b9cc-af00f287e651 from this chassis (sb_readonly=0)
Nov 26 05:01:29 localhost ovn_controller[153664]: 2025-11-26T10:01:29Z|00245|binding|INFO|Setting lport 0dc44189-6e84-4e22-b9cc-af00f287e651 down in Southbound
Nov 26 05:01:29 localhost kernel: device tap0dc44189-6e left promiscuous mode
Nov 26 05:01:29 localhost nova_compute[281415]: 2025-11-26 10:01:29.928 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:29.935 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fee6:66a4/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0dc44189-6e84-4e22-b9cc-af00f287e651) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:01:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:29.938 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0dc44189-6e84-4e22-b9cc-af00f287e651 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m
Nov 26 05:01:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:29.940 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 26 05:01:29 localhost nova_compute[281415]: 2025-11-26 10:01:29.946 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:29.946 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d02661-fa94-4492-9e0f-745f8ef12643]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:01:30 localhost podman[315784]:
Nov 26 05:01:30 localhost podman[315784]: 2025-11-26 10:01:30.181889274 +0000 UTC m=+0.094610450 container create 956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-537168b7-124e-4fa7-8d35-689fb2776839, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 26 05:01:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:30.189 262471 INFO neutron.agent.dhcp.agent [None req-5a0928df-240a-4c78-92f3-0c53f97d6365 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:01:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:30.190 262471 INFO neutron.agent.dhcp.agent [None req-5a0928df-240a-4c78-92f3-0c53f97d6365 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:01:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:30.191 262471 INFO neutron.agent.dhcp.agent [None req-5a0928df-240a-4c78-92f3-0c53f97d6365 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:01:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:30.191 262471 INFO neutron.agent.dhcp.agent [None req-5a0928df-240a-4c78-92f3-0c53f97d6365 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:01:30 localhost systemd[1]: Started libpod-conmon-956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf.scope.
Nov 26 05:01:30 localhost systemd[1]: Started libcrun container.
Nov 26 05:01:30 localhost podman[315784]: 2025-11-26 10:01:30.139816063 +0000 UTC m=+0.052537269 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:01:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a4659f68a244c5ed1cbfdeea2a2987592cb0a5d0214a8aac5d3540004548b54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:01:30 localhost podman[315784]: 2025-11-26 10:01:30.252833057 +0000 UTC m=+0.165554233 container init 956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-537168b7-124e-4fa7-8d35-689fb2776839, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 26 05:01:30 localhost podman[315784]: 2025-11-26 10:01:30.261839352 +0000 UTC m=+0.174560528 container start 956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-537168b7-124e-4fa7-8d35-689fb2776839, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 05:01:30 localhost dnsmasq[315802]: started, version 2.85 cachesize 150
Nov 26 05:01:30 localhost dnsmasq[315802]: DNS service limited to local subnets
Nov 26 05:01:30 localhost dnsmasq[315802]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:01:30 localhost dnsmasq[315802]: warning: no upstream servers configured
Nov 26 05:01:30 localhost dnsmasq-dhcp[315802]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 26 05:01:30 localhost dnsmasq[315802]: read /var/lib/neutron/dhcp/537168b7-124e-4fa7-8d35-689fb2776839/addn_hosts - 0 addresses
Nov 26 05:01:30 localhost dnsmasq-dhcp[315802]: read /var/lib/neutron/dhcp/537168b7-124e-4fa7-8d35-689fb2776839/host
Nov 26 05:01:30 localhost dnsmasq-dhcp[315802]: read /var/lib/neutron/dhcp/537168b7-124e-4fa7-8d35-689fb2776839/opts
Nov 26 05:01:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:30.327 262471 INFO neutron.agent.dhcp.agent [None req-459a6f71-cb41-4c99-8ec4-79bd111d09f8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:28Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0c14597e-8a5b-42d5-b6c7-3f1aca0246d7, ip_allocation=immediate, mac_address=fa:16:3e:bd:5d:44, name=tempest-PortsIpV6TestJSON-1744656999, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:01:26Z, description=, dns_domain=, id=537168b7-124e-4fa7-8d35-689fb2776839, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1305696382, port_security_enabled=True, project_id=083b00bb83474f96865b0c5a38c5f88f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1425, status=ACTIVE, subnets=['104866cb-2163-48d1-b548-cea8169aad16'], tags=[], tenant_id=083b00bb83474f96865b0c5a38c5f88f, updated_at=2025-11-26T10:01:27Z, vlan_transparent=None, network_id=537168b7-124e-4fa7-8d35-689fb2776839, port_security_enabled=True, project_id=083b00bb83474f96865b0c5a38c5f88f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1462, status=DOWN, tags=[], tenant_id=083b00bb83474f96865b0c5a38c5f88f, updated_at=2025-11-26T10:01:28Z on network 537168b7-124e-4fa7-8d35-689fb2776839#033[00m
Nov 26 05:01:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:30.414 262471 INFO neutron.agent.dhcp.agent [None req-4d29cef6-071f-4b46-8901-ddb341caa597 - - - - - -] DHCP configuration for ports {'fb7c4854-b43d-4322-96aa-e30c09c28b96'} is completed#033[00m
Nov 26 05:01:30 localhost dnsmasq[315802]: read /var/lib/neutron/dhcp/537168b7-124e-4fa7-8d35-689fb2776839/addn_hosts - 1 addresses
Nov 26 05:01:30 localhost dnsmasq-dhcp[315802]: read /var/lib/neutron/dhcp/537168b7-124e-4fa7-8d35-689fb2776839/host
Nov 26 05:01:30 localhost dnsmasq-dhcp[315802]: read /var/lib/neutron/dhcp/537168b7-124e-4fa7-8d35-689fb2776839/opts
Nov 26 05:01:30 localhost podman[315821]: 2025-11-26 10:01:30.53913046 +0000 UTC m=+0.065282347 container kill 956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-537168b7-124e-4fa7-8d35-689fb2776839, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 05:01:30 localhost systemd[1]: var-lib-containers-storage-overlay-e1348d8aaf01f058f309cdc23a63a4345175cefda16b62ad26a01d35ebd90358-merged.mount: Deactivated successfully.
Nov 26 05:01:30 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. Nov 26 05:01:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:30.831 262471 INFO neutron.agent.dhcp.agent [None req-da4dc6f4-69df-40df-a5ec-808ca1eef047 - - - - - -] DHCP configuration for ports {'0c14597e-8a5b-42d5-b6c7-3f1aca0246d7'} is completed#033[00m Nov 26 05:01:30 localhost nova_compute[281415]: 2025-11-26 10:01:30.870 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:31 localhost podman[315860]: 2025-11-26 10:01:31.012166001 +0000 UTC m=+0.066451271 container kill 956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-537168b7-124e-4fa7-8d35-689fb2776839, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:31 localhost dnsmasq[315802]: exiting on receipt of SIGTERM Nov 26 05:01:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:01:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:01:31 localhost nova_compute[281415]: 2025-11-26 10:01:31.042 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:31 localhost systemd[1]: libpod-956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf.scope: Deactivated successfully. 
Nov 26 05:01:31 localhost podman[315875]: 2025-11-26 10:01:31.105534864 +0000 UTC m=+0.072836859 container died 956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-537168b7-124e-4fa7-8d35-689fb2776839, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:01:31 localhost systemd[1]: tmp-crun.KBBo48.mount: Deactivated successfully. Nov 26 05:01:31 localhost ovn_controller[153664]: 2025-11-26T10:01:31Z|00246|binding|INFO|Removing iface tapbe3415dd-92 ovn-installed in OVS Nov 26 05:01:31 localhost ovn_controller[153664]: 2025-11-26T10:01:31Z|00247|binding|INFO|Removing lport be3415dd-920c-4817-841b-9e5ad30419d2 ovn-installed in OVS Nov 26 05:01:31 localhost nova_compute[281415]: 2025-11-26 10:01:31.151 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.151 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 2b8deb1c-eebe-4a00-892b-943756b8fe77 with type ""#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.152 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-537168b7-124e-4fa7-8d35-689fb2776839', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-537168b7-124e-4fa7-8d35-689fb2776839', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '083b00bb83474f96865b0c5a38c5f88f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80867537-71e8-4eeb-a4c9-0594c30408ce, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=be3415dd-920c-4817-841b-9e5ad30419d2) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.155 159486 INFO neutron.agent.ovn.metadata.agent [-] Port be3415dd-920c-4817-841b-9e5ad30419d2 in datapath 537168b7-124e-4fa7-8d35-689fb2776839 unbound from our chassis#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.156 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 537168b7-124e-4fa7-8d35-689fb2776839 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.157 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[c2edd962-35ab-4fb9-8409-6aac515b051d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:31 localhost nova_compute[281415]: 2025-11-26 10:01:31.158 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.165 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 2001:db8::f816:3eff:feaf:7403'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 10.100.0.2 2001:db8::f816:3eff:feaf:7403'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.167 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d updated#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.169 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:31.170 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1c88493e-020e-4bbb-be45-21f043343ce2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:31 localhost podman[315875]: 2025-11-26 10:01:31.198046213 +0000 UTC m=+0.165348198 container cleanup 956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-537168b7-124e-4fa7-8d35-689fb2776839, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:01:31 localhost systemd[1]: libpod-conmon-956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf.scope: Deactivated successfully. 
Nov 26 05:01:31 localhost podman[315888]: 2025-11-26 10:01:31.214807407 +0000 UTC m=+0.161253177 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Nov 26 05:01:31 localhost podman[315876]: 2025-11-26 10:01:31.220274948 +0000 UTC m=+0.184025218 container remove 
956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-537168b7-124e-4fa7-8d35-689fb2776839, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:31 localhost kernel: device tapbe3415dd-92 left promiscuous mode Nov 26 05:01:31 localhost nova_compute[281415]: 2025-11-26 10:01:31.237 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:31 localhost podman[315888]: 2025-11-26 10:01:31.253742196 +0000 UTC m=+0.200187996 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=edpm) Nov 26 05:01:31 localhost nova_compute[281415]: 2025-11-26 10:01:31.255 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:31 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 05:01:31 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:31.282 262471 INFO neutron.agent.dhcp.agent [None req-f26b3e88-1ab2-4700-86aa-85ddd26bf50a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:31 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:31.282 262471 INFO neutron.agent.dhcp.agent [None req-f26b3e88-1ab2-4700-86aa-85ddd26bf50a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:31 localhost podman[315883]: 2025-11-26 10:01:31.344353547 +0000 UTC m=+0.292499417 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:01:31 localhost podman[315883]: 2025-11-26 10:01:31.359393692 +0000 UTC m=+0.307539562 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, 
config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:01:31 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:01:31 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:31.688 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:31 localhost systemd[1]: var-lib-containers-storage-overlay-2a4659f68a244c5ed1cbfdeea2a2987592cb0a5d0214a8aac5d3540004548b54-merged.mount: Deactivated successfully. Nov 26 05:01:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-956630711a9f3e8d20b87632d685b363ab25a443f5003b38419371cd236b73bf-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:31 localhost systemd[1]: run-netns-qdhcp\x2d537168b7\x2d124e\x2d4fa7\x2d8d35\x2d689fb2776839.mount: Deactivated successfully. 
Nov 26 05:01:31 localhost ovn_controller[153664]: 2025-11-26T10:01:31Z|00248|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:32 localhost nova_compute[281415]: 2025-11-26 10:01:32.071 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:32.071 2 INFO neutron.agent.securitygroups_rpc [None req-e70c9a43-93e1-4145-aa83-3224e2bc9784 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:32.100 262471 INFO neutron.agent.linux.ip_lib [None req-11c8596a-142f-48fa-93fa-9a263cdb972f - - - - - -] Device tapd3630127-2a cannot be used as it has no MAC address#033[00m Nov 26 05:01:32 localhost nova_compute[281415]: 2025-11-26 10:01:32.127 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost kernel: device tapd3630127-2a entered promiscuous mode Nov 26 05:01:32 localhost ovn_controller[153664]: 2025-11-26T10:01:32Z|00249|binding|INFO|Claiming lport d3630127-2a81-4c86-8082-728c13f4770d for this chassis. 
Nov 26 05:01:32 localhost ovn_controller[153664]: 2025-11-26T10:01:32Z|00250|binding|INFO|d3630127-2a81-4c86-8082-728c13f4770d: Claiming unknown Nov 26 05:01:32 localhost nova_compute[281415]: 2025-11-26 10:01:32.134 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost NetworkManager[5970]: [1764151292.1398] manager: (tapd3630127-2a): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Nov 26 05:01:32 localhost systemd-udevd[315955]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:01:32 localhost ovn_controller[153664]: 2025-11-26T10:01:32Z|00251|binding|INFO|Setting lport d3630127-2a81-4c86-8082-728c13f4770d ovn-installed in OVS Nov 26 05:01:32 localhost ovn_controller[153664]: 2025-11-26T10:01:32Z|00252|binding|INFO|Setting lport d3630127-2a81-4c86-8082-728c13f4770d up in Southbound Nov 26 05:01:32 localhost nova_compute[281415]: 2025-11-26 10:01:32.149 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:32.150 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe9a:c9a8/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 
'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d3630127-2a81-4c86-8082-728c13f4770d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:32.152 159486 INFO neutron.agent.ovn.metadata.agent [-] Port d3630127-2a81-4c86-8082-728c13f4770d in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:01:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:32.154 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2ba2edd7-f453-4b55-b239-fa2378a9de19 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:32.155 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:32 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:32.156 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb7c497-613d-4247-950d-99f0e9b93873]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:32 localhost nova_compute[281415]: 2025-11-26 10:01:32.165 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost journal[229445]: ethtool ioctl error on tapd3630127-2a: No such device Nov 26 05:01:32 localhost journal[229445]: ethtool ioctl error on tapd3630127-2a: No such device Nov 26 05:01:32 localhost journal[229445]: ethtool ioctl error on tapd3630127-2a: No such device Nov 26 05:01:32 localhost journal[229445]: ethtool ioctl error on tapd3630127-2a: No such device Nov 26 05:01:32 localhost journal[229445]: ethtool ioctl error on tapd3630127-2a: No such device Nov 26 05:01:32 localhost journal[229445]: ethtool ioctl error on tapd3630127-2a: No such device Nov 26 05:01:32 localhost journal[229445]: ethtool ioctl error on tapd3630127-2a: No such device Nov 26 05:01:32 localhost journal[229445]: ethtool ioctl error on tapd3630127-2a: No such device Nov 26 05:01:32 localhost nova_compute[281415]: 2025-11-26 10:01:32.212 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost nova_compute[281415]: 2025-11-26 10:01:32.245 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost nova_compute[281415]: 2025-11-26 10:01:32.332 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:32.411 2 INFO neutron.agent.securitygroups_rpc [None req-1bb4f526-20f7-4384-9aec-a9899459bcfc a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:32 localhost ovn_controller[153664]: 2025-11-26T10:01:32Z|00253|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:32 
localhost nova_compute[281415]: 2025-11-26 10:01:32.649 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:32 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:32.752 2 INFO neutron.agent.securitygroups_rpc [None req-78117b72-6e95-4af9-9734-e0a0c5f4011b a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:32 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:32.997 2 INFO neutron.agent.securitygroups_rpc [None req-2e5bbb00-336a-45ac-9bcf-992491491c71 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:33 localhost podman[316026]: Nov 26 05:01:33 localhost podman[316026]: 2025-11-26 10:01:33.069961409 +0000 UTC m=+0.105632796 container create a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:01:33 localhost systemd[1]: Started libpod-conmon-a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761.scope. 
Nov 26 05:01:33 localhost podman[316026]: 2025-11-26 10:01:33.019194992 +0000 UTC m=+0.054866389 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:01:33 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:33.133 2 INFO neutron.agent.securitygroups_rpc [None req-58d96308-4a4a-4bb7-959c-2bbcb5ddf5ab a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m
Nov 26 05:01:33 localhost systemd[1]: Started libcrun container.
Nov 26 05:01:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ae71eb6c7125e0c2a516d4c14f8dba92ffb0d732ad7f4d3198f1c8bfbf19ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:01:33 localhost podman[316026]: 2025-11-26 10:01:33.160333314 +0000 UTC m=+0.196004661 container init a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 26 05:01:33 localhost podman[316026]: 2025-11-26 10:01:33.172164583 +0000 UTC m=+0.207835930 container start a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 05:01:33 localhost dnsmasq[316043]: started, version 2.85 cachesize 150
Nov 26 05:01:33 localhost dnsmasq[316043]: DNS service limited to local subnets
Nov 26 05:01:33 localhost dnsmasq[316043]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:01:33 localhost dnsmasq[316043]: warning: no upstream servers configured
Nov 26 05:01:33 localhost dnsmasq[316043]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:01:33 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:33.244 262471 INFO neutron.agent.dhcp.agent [None req-11c8596a-142f-48fa-93fa-9a263cdb972f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:31Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=26ed47d3-f2d0-4305-99a4-8e372a37bfe8, ip_allocation=immediate, mac_address=fa:16:3e:7a:88:37, name=tempest-NetworksTestDHCPv6-984765453, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=38, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['ae33587a-632d-4f94-9494-947e956294f0'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:30Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1482, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:31Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m
Nov 26 05:01:33 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:33.295 262471 INFO neutron.agent.dhcp.agent [None req-2d326422-809a-494d-a16c-e76479491205 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m
Nov 26 05:01:33 localhost dnsmasq[316043]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses
Nov 26 05:01:33 localhost podman[316062]: 2025-11-26 10:01:33.46501979 +0000 UTC m=+0.066127171 container kill a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:01:33 localhost ovn_controller[153664]: 2025-11-26T10:01:33Z|00254|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 05:01:33 localhost nova_compute[281415]: 2025-11-26 10:01:33.620 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:33 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:33.664 2 INFO neutron.agent.securitygroups_rpc [None req-7b77a51d-a29d-49d4-b249-8ca9d64c58c1 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m
Nov 26 05:01:33 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:33.735 262471 INFO neutron.agent.dhcp.agent [None req-6def2d6d-610c-4ddc-96cc-fc989816a77f - - - - - -] DHCP configuration for ports {'26ed47d3-f2d0-4305-99a4-8e372a37bfe8'} is completed#033[00m
Nov 26 05:01:33 localhost dnsmasq[316043]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:01:33 localhost podman[316097]: 2025-11-26 10:01:33.954953689 +0000 UTC m=+0.066810202 container kill a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 26 05:01:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 05:01:34 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:34.484 2 INFO neutron.agent.securitygroups_rpc [None req-f1e67f54-cbe1-4f80-90e1-7580592ab278 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m
Nov 26 05:01:34 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:34.531 262471 INFO neutron.agent.dhcp.agent [None req-4a616263-a68a-4881-abbf-0747597a945f - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'd3630127-2a81-4c86-8082-728c13f4770d'} is completed#033[00m
Nov 26 05:01:34 localhost dnsmasq[316043]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:01:34 localhost podman[316136]: 2025-11-26 10:01:34.988596092 +0000 UTC m=+0.067567853 container kill a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 26 05:01:35 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:35.063 2 INFO neutron.agent.securitygroups_rpc [None req-7ee17f6b-3822-493d-8596-30a67cae6792 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m
Nov 26 05:01:35 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:35.164 262471 INFO neutron.agent.dhcp.agent [None req-2a42c7fb-9492-41f2-9478-694f876978f0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:34Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2e3f53aa-c9d9-40a2-99fa-d821cf823dda, ip_allocation=immediate, mac_address=fa:16:3e:19:c5:16, name=tempest-NetworksTestDHCPv6-2066533184, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['7ce31d17-398c-4f04-967f-8c8cbf83327b'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:33Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1502, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:34Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m
Nov 26 05:01:35 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:35.355 262471 INFO neutron.agent.dhcp.agent [None req-abe984bd-42fc-47f9-aad6-fb821cfd0687 - - - - - -] DHCP configuration for ports {'d3630127-2a81-4c86-8082-728c13f4770d', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m
Nov 26 05:01:35 localhost dnsmasq[316043]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses
Nov 26 05:01:35 localhost podman[316176]: 2025-11-26 10:01:35.36081321 +0000 UTC m=+0.053728246 container kill a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 26 05:01:35 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:35.493 2 INFO neutron.agent.securitygroups_rpc [None req-3ab49a00-145f-4220-a555-02e2d1646950 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m
Nov 26 05:01:35 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:35.560 2 INFO neutron.agent.securitygroups_rpc [None req-86736c23-afe1-4e89-9e3f-cfe64736ee9d a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m
Nov 26 05:01:35 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:35.578 262471 INFO neutron.agent.dhcp.agent [None req-73601d18-fb47-421a-b210-a0580043bc8d - - - - - -] DHCP configuration for ports {'2e3f53aa-c9d9-40a2-99fa-d821cf823dda'} is completed#033[00m
Nov 26 05:01:35 localhost systemd[1]: tmp-crun.t82HX4.mount: Deactivated successfully.
Nov 26 05:01:35 localhost podman[316215]: 2025-11-26 10:01:35.755408877 +0000 UTC m=+0.073110957 container kill a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 26 05:01:35 localhost dnsmasq[316043]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:01:36 localhost nova_compute[281415]: 2025-11-26 10:01:36.090 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e122 e122: 6 total, 6 up, 6 in
Nov 26 05:01:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.
Nov 26 05:01:36 localhost podman[316251]: 2025-11-26 10:01:36.76901047 +0000 UTC m=+0.062730992 container kill a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 26 05:01:36 localhost dnsmasq[316043]: exiting on receipt of SIGTERM
Nov 26 05:01:36 localhost systemd[1]: libpod-a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761.scope: Deactivated successfully.
Nov 26 05:01:36 localhost podman[316261]: 2025-11-26 10:01:36.839491838 +0000 UTC m=+0.093772617 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Nov 26 05:01:36 localhost podman[316279]: 2025-11-26 10:01:36.859147448 +0000 UTC m=+0.064265066 container died a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 26 05:01:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.
Nov 26 05:01:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761-userdata-shm.mount: Deactivated successfully.
Nov 26 05:01:36 localhost systemd[1]: var-lib-containers-storage-overlay-34ae71eb6c7125e0c2a516d4c14f8dba92ffb0d732ad7f4d3198f1c8bfbf19ec-merged.mount: Deactivated successfully.
Nov 26 05:01:36 localhost podman[316309]: 2025-11-26 10:01:36.980213918 +0000 UTC m=+0.082008099 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 26 05:01:37 localhost podman[316279]: 2025-11-26 10:01:37.030680247 +0000 UTC m=+0.235797815 container remove a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 26 05:01:37 localhost systemd[1]: libpod-conmon-a8ac93d6f00c5da272a7de59b723fdc84759520b63612191fa51681761c86761.scope: Deactivated successfully.
Nov 26 05:01:37 localhost podman[316309]: 2025-11-26 10:01:37.050966955 +0000 UTC m=+0.152761146 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 26 05:01:37 localhost ovn_controller[153664]: 2025-11-26T10:01:37Z|00255|binding|INFO|Releasing lport d3630127-2a81-4c86-8082-728c13f4770d from this chassis (sb_readonly=0)
Nov 26 05:01:37 localhost kernel: device tapd3630127-2a left promiscuous mode
Nov 26 05:01:37 localhost ovn_controller[153664]: 2025-11-26T10:01:37Z|00256|binding|INFO|Setting lport d3630127-2a81-4c86-8082-728c13f4770d down in Southbound
Nov 26 05:01:37 localhost nova_compute[281415]: 2025-11-26 10:01:37.051 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:37 localhost podman[316261]: 2025-11-26 10:01:37.061454434 +0000 UTC m=+0.315735263 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 26 05:01:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:37.063 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe9a:c9a8/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d3630127-2a81-4c86-8082-728c13f4770d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:01:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:37.064 159486 INFO neutron.agent.ovn.metadata.agent [-] Port d3630127-2a81-4c86-8082-728c13f4770d in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m
Nov 26 05:01:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:37.066 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 26 05:01:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:37.067 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2c7e59-a821-4862-ae09-30f1690fd238]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:01:37 localhost nova_compute[281415]: 2025-11-26 10:01:37.073 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:37 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully.
Nov 26 05:01:37 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully.
Nov 26 05:01:37 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:37.098 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:01:37 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:37.274 262471 INFO neutron.agent.dhcp.agent [None req-444b6059-4bb1-44de-9652-04b0d7be051f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:01:37 localhost nova_compute[281415]: 2025-11-26 10:01:37.368 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e123 e123: 6 total, 6 up, 6 in
Nov 26 05:01:37 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully.
Nov 26 05:01:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:39.121 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:01:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 05:01:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:39.369 262471 INFO neutron.agent.linux.ip_lib [None req-3c1934d0-f080-4aa2-9d06-097305c089aa - - - - - -] Device tapa7c92b27-30 cannot be used as it has no MAC address#033[00m
Nov 26 05:01:39 localhost nova_compute[281415]: 2025-11-26 10:01:39.394 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:39 localhost kernel: device tapa7c92b27-30 entered promiscuous mode
Nov 26 05:01:39 localhost NetworkManager[5970]: [1764151299.4034] manager: (tapa7c92b27-30): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Nov 26 05:01:39 localhost nova_compute[281415]: 2025-11-26 10:01:39.403 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:39 localhost ovn_controller[153664]: 2025-11-26T10:01:39Z|00257|binding|INFO|Claiming lport a7c92b27-30e3-4635-bf82-6ee8df0ccee2 for this chassis.
Nov 26 05:01:39 localhost ovn_controller[153664]: 2025-11-26T10:01:39Z|00258|binding|INFO|a7c92b27-30e3-4635-bf82-6ee8df0ccee2: Claiming unknown
Nov 26 05:01:39 localhost systemd-udevd[316349]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 05:01:39 localhost ovn_controller[153664]: 2025-11-26T10:01:39Z|00259|binding|INFO|Setting lport a7c92b27-30e3-4635-bf82-6ee8df0ccee2 ovn-installed in OVS
Nov 26 05:01:39 localhost ovn_controller[153664]: 2025-11-26T10:01:39Z|00260|binding|INFO|Setting lport a7c92b27-30e3-4635-bf82-6ee8df0ccee2 up in Southbound
Nov 26 05:01:39 localhost nova_compute[281415]: 2025-11-26 10:01:39.413 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:39 localhost nova_compute[281415]: 2025-11-26 10:01:39.416 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:39.417 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe48:67ce/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a7c92b27-30e3-4635-bf82-6ee8df0ccee2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:01:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:39.419 159486 INFO neutron.agent.ovn.metadata.agent [-] Port a7c92b27-30e3-4635-bf82-6ee8df0ccee2 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m
Nov 26 05:01:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:39.423 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port f9b2ac88-1630-4797-9c4c-6516b3e363a7 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Nov 26 05:01:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:39.423 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 26 05:01:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:39.424 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ff4cb920-f504-4444-8107-8e35fef9f221]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:01:39 localhost nova_compute[281415]: 2025-11-26 10:01:39.437 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:39 localhost journal[229445]: ethtool ioctl error on tapa7c92b27-30: No such device
Nov 26 05:01:39 localhost journal[229445]: ethtool ioctl error on tapa7c92b27-30: No such device
Nov 26 05:01:39 localhost journal[229445]: ethtool ioctl error on tapa7c92b27-30: No such device
Nov 26 05:01:39 localhost journal[229445]: ethtool ioctl error on tapa7c92b27-30: No such device
Nov 26 05:01:39 localhost journal[229445]: ethtool ioctl error on tapa7c92b27-30: No such device
Nov 26 05:01:39 localhost journal[229445]: ethtool ioctl error on tapa7c92b27-30: No such device
Nov 26 05:01:39 localhost journal[229445]: ethtool ioctl error on tapa7c92b27-30: No such device
Nov 26 05:01:39 localhost journal[229445]: ethtool ioctl error on tapa7c92b27-30: No such device
Nov 26 05:01:39 localhost nova_compute[281415]: 2025-11-26 10:01:39.486 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:39 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:39.503 2 INFO neutron.agent.securitygroups_rpc [None req-89034895-0105-4cda-bb17-64ef25da86f6 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m
Nov 26 05:01:39 localhost nova_compute[281415]: 2025-11-26 10:01:39.515 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:01:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e124 e124: 6 total, 6 up, 6 in
Nov 26 05:01:40 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:40.145 2 INFO neutron.agent.securitygroups_rpc [None req-27055e6c-f788-4c02-a4c2-b1a2c4cd80dc a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m
Nov 26 05:01:40 localhost podman[316420]:
Nov 26 05:01:40 localhost podman[316420]: 2025-11-26 10:01:40.429231416 +0000 UTC m=+0.101273098 container create 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 26 05:01:40 localhost podman[316420]: 2025-11-26 10:01:40.381474277 +0000 UTC m=+0.053515969 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:01:40 localhost systemd[1]: Started libpod-conmon-8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f.scope.
Nov 26 05:01:40 localhost systemd[1]: tmp-crun.uXuOmp.mount: Deactivated successfully.
Nov 26 05:01:40 localhost systemd[1]: Started libcrun container.
Nov 26 05:01:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14a2e722d4d10dd4a3d1e027cf16acf9a3c2ba5bc4d5531dc8ca982a06bcc235/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:01:40 localhost podman[316420]: 2025-11-26 10:01:40.543220278 +0000 UTC m=+0.215261960 container init 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 26 05:01:40 localhost podman[316420]: 2025-11-26 10:01:40.553319536 +0000 UTC m=+0.225361188 container start 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 26 05:01:40 localhost dnsmasq[316438]: started, version 2.85 cachesize 150
Nov 26 05:01:40 localhost dnsmasq[316438]: DNS service limited to local subnets
Nov 26 05:01:40 localhost dnsmasq[316438]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:01:40 localhost dnsmasq[316438]: warning: no upstream servers configured
Nov 26 05:01:40 localhost dnsmasq[316438]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:01:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:40.626 262471 INFO neutron.agent.dhcp.agent [None req-3c1934d0-f080-4aa2-9d06-097305c089aa - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:38Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=696d3a15-04eb-4cdc-a8c4-979ed0121e3b, ip_allocation=immediate, mac_address=fa:16:3e:83:d6:53, name=tempest-NetworksTestDHCPv6-1773071049, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True,
project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['51c769d7-f4ff-4368-b422-620ec2166029'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:36Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1532, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:39Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:01:40 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:40.639 2 INFO neutron.agent.securitygroups_rpc [None req-646d66ce-2b81-47ce-b7a5-82f37fcbec39 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e125 e125: 6 total, 6 up, 6 in Nov 26 05:01:40 localhost dnsmasq[316438]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:01:40 localhost podman[316457]: 2025-11-26 10:01:40.860607888 +0000 UTC m=+0.062435662 container kill 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:01:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:40.874 262471 INFO neutron.agent.dhcp.agent [None req-6fd324a3-9092-48d0-ba8c-b515f0f47a8f - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:01:41 localhost nova_compute[281415]: 2025-11-26 10:01:41.130 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:01:41 localhost dnsmasq[316438]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:41 localhost podman[316494]: 2025-11-26 10:01:41.243050097 +0000 UTC m=+0.063644878 container kill 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:01:41 localhost podman[316506]: 2025-11-26 10:01:41.340523022 +0000 UTC m=+0.092711686 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:01:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:41.351 262471 INFO neutron.agent.dhcp.agent [None req-1632ff94-148c-4430-bef2-92f20598b22d - - - - - -] DHCP configuration for ports {'696d3a15-04eb-4cdc-a8c4-979ed0121e3b'} is completed#033[00m Nov 26 05:01:41 localhost podman[316506]: 2025-11-26 10:01:41.355456302 +0000 UTC m=+0.107644926 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:01:41 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:01:41 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e126 e126: 6 total, 6 up, 6 in Nov 26 05:01:41 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:41.769 2 INFO neutron.agent.securitygroups_rpc [None req-7b05982b-7ae8-43fd-8710-5e2ee3aaf193 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:42 localhost nova_compute[281415]: 2025-11-26 10:01:42.393 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:42 localhost dnsmasq[316438]: exiting on receipt of SIGTERM Nov 26 05:01:42 localhost podman[316554]: 2025-11-26 10:01:42.480604584 +0000 UTC m=+0.067061427 container kill 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:42 localhost systemd[1]: libpod-8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f.scope: Deactivated successfully. 
Nov 26 05:01:42 localhost podman[316567]: 2025-11-26 10:01:42.562629814 +0000 UTC m=+0.061519515 container died 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 05:01:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:42 localhost systemd[1]: var-lib-containers-storage-overlay-14a2e722d4d10dd4a3d1e027cf16acf9a3c2ba5bc4d5531dc8ca982a06bcc235-merged.mount: Deactivated successfully. Nov 26 05:01:42 localhost podman[316567]: 2025-11-26 10:01:42.591975139 +0000 UTC m=+0.090864790 container cleanup 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:01:42 localhost systemd[1]: libpod-conmon-8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f.scope: Deactivated successfully. 
Nov 26 05:01:42 localhost podman[316569]: 2025-11-26 10:01:42.649605359 +0000 UTC m=+0.137361272 container remove 8b439233def2334393d1d52fd1cf22290501b31a0eccfb05a4ad5a33f23efb1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:01:42 localhost ovn_controller[153664]: 2025-11-26T10:01:42Z|00261|binding|INFO|Releasing lport a7c92b27-30e3-4635-bf82-6ee8df0ccee2 from this chassis (sb_readonly=0) Nov 26 05:01:42 localhost ovn_controller[153664]: 2025-11-26T10:01:42Z|00262|binding|INFO|Setting lport a7c92b27-30e3-4635-bf82-6ee8df0ccee2 down in Southbound Nov 26 05:01:42 localhost kernel: device tapa7c92b27-30 left promiscuous mode Nov 26 05:01:42 localhost nova_compute[281415]: 2025-11-26 10:01:42.665 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:42.693 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe48:67ce/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a7c92b27-30e3-4635-bf82-6ee8df0ccee2) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:42 localhost nova_compute[281415]: 2025-11-26 10:01:42.694 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:42.696 159486 INFO neutron.agent.ovn.metadata.agent [-] Port a7c92b27-30e3-4635-bf82-6ee8df0ccee2 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:01:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:42.698 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:42.700 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[3756d8ac-f1f6-4537-83d7-12d2f7144452]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:43 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:43.204 2 INFO neutron.agent.securitygroups_rpc [None req-cb1e9a3c-1b4c-4620-bb7a-b8f8a9f14293 
a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:43.252 262471 INFO neutron.agent.dhcp.agent [None req-1cedeca6-777a-4a7a-8087-bd7106f71f6f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e127 e127: 6 total, 6 up, 6 in Nov 26 05:01:43 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. Nov 26 05:01:43 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:43.913 2 INFO neutron.agent.securitygroups_rpc [None req-95cf71e3-1bd5-4fef-8191-4325ba087ead a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:01:43 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/101072340' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:01:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:01:43 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/101072340' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:01:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:45.420 262471 INFO neutron.agent.linux.ip_lib [None req-007a81fb-bc4e-49e0-9073-3de58c822265 - - - - - -] Device tap2093a468-29 cannot be used as it has no MAC address#033[00m Nov 26 05:01:45 localhost nova_compute[281415]: 2025-11-26 10:01:45.449 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:45 localhost kernel: device tap2093a468-29 entered promiscuous mode Nov 26 05:01:45 localhost ovn_controller[153664]: 2025-11-26T10:01:45Z|00263|binding|INFO|Claiming lport 2093a468-29e7-43ca-9d5a-392fda005f4f for this chassis. Nov 26 05:01:45 localhost ovn_controller[153664]: 2025-11-26T10:01:45Z|00264|binding|INFO|2093a468-29e7-43ca-9d5a-392fda005f4f: Claiming unknown Nov 26 05:01:45 localhost nova_compute[281415]: 2025-11-26 10:01:45.458 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:45 localhost NetworkManager[5970]: [1764151305.4614] manager: (tap2093a468-29): new Generic device (/org/freedesktop/NetworkManager/Devices/44) Nov 26 05:01:45 localhost systemd-udevd[316607]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:01:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:45.464 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feed:c748/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2093a468-29e7-43ca-9d5a-392fda005f4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:45.466 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 2093a468-29e7-43ca-9d5a-392fda005f4f in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:01:45 localhost ovn_controller[153664]: 2025-11-26T10:01:45Z|00265|binding|INFO|Setting lport 2093a468-29e7-43ca-9d5a-392fda005f4f ovn-installed in OVS Nov 26 05:01:45 localhost ovn_controller[153664]: 2025-11-26T10:01:45Z|00266|binding|INFO|Setting lport 
2093a468-29e7-43ca-9d5a-392fda005f4f up in Southbound Nov 26 05:01:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:45.468 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 14041d5b-7fd4-4b39-84b3-a182594927ac IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:45 localhost nova_compute[281415]: 2025-11-26 10:01:45.468 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:45.468 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:45.469 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2d2054-09a8-4931-9fc0-7b5f255f0c48]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:45 localhost nova_compute[281415]: 2025-11-26 10:01:45.470 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:45 localhost journal[229445]: ethtool ioctl error on tap2093a468-29: No such device Nov 26 05:01:45 localhost journal[229445]: ethtool ioctl error on tap2093a468-29: No such device Nov 26 05:01:45 localhost nova_compute[281415]: 2025-11-26 10:01:45.493 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:45 localhost journal[229445]: ethtool ioctl error on tap2093a468-29: No such device Nov 26 05:01:45 localhost journal[229445]: ethtool ioctl error on 
tap2093a468-29: No such device Nov 26 05:01:45 localhost journal[229445]: ethtool ioctl error on tap2093a468-29: No such device Nov 26 05:01:45 localhost journal[229445]: ethtool ioctl error on tap2093a468-29: No such device Nov 26 05:01:45 localhost journal[229445]: ethtool ioctl error on tap2093a468-29: No such device Nov 26 05:01:45 localhost journal[229445]: ethtool ioctl error on tap2093a468-29: No such device Nov 26 05:01:45 localhost nova_compute[281415]: 2025-11-26 10:01:45.536 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:45 localhost nova_compute[281415]: 2025-11-26 10:01:45.572 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:45 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:45.625 2 INFO neutron.agent.securitygroups_rpc [None req-7ad413bc-1979-47b4-93b4-bba18ac738fb 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 05:01:45 localhost openstack_network_exporter[242153]: ERROR 10:01:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:01:45 localhost openstack_network_exporter[242153]: ERROR 10:01:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:01:45 localhost openstack_network_exporter[242153]: ERROR 10:01:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:01:45 localhost openstack_network_exporter[242153]: ERROR 10:01:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:01:45 localhost openstack_network_exporter[242153]: Nov 26 05:01:45 localhost openstack_network_exporter[242153]: ERROR 10:01:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:01:45 localhost openstack_network_exporter[242153]: Nov 26 05:01:45 localhost podman[316640]: 2025-11-26 10:01:45.85120325 +0000 UTC m=+0.104379620 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd) Nov 26 05:01:45 localhost podman[316640]: 2025-11-26 10:01:45.864491361 +0000 UTC m=+0.117667731 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:01:45 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:01:45 localhost podman[316639]: 2025-11-26 10:01:45.923215653 +0000 UTC m=+0.181582175 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:45 localhost podman[316639]: 2025-11-26 10:01:45.957419112 +0000 UTC m=+0.215785624 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Nov 26 05:01:45 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:01:46 localhost nova_compute[281415]: 2025-11-26 10:01:46.170 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:46 localhost podman[316715]: Nov 26 05:01:46 localhost podman[316715]: 2025-11-26 10:01:46.584106934 +0000 UTC m=+0.091308973 container create 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:01:46 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:46.633 2 INFO neutron.agent.securitygroups_rpc [None req-5e32cbcf-28aa-48d6-934a-92bc1a970f80 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated 
['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:46 localhost systemd[1]: Started libpod-conmon-5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd.scope. Nov 26 05:01:46 localhost podman[316715]: 2025-11-26 10:01:46.541847208 +0000 UTC m=+0.049049287 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:46 localhost systemd[1]: Started libcrun container. Nov 26 05:01:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b570cc275f422f53e906e01fcc5a7f395c30391052c13dc9572d5f87eef656f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:46 localhost podman[316715]: 2025-11-26 10:01:46.668971237 +0000 UTC m=+0.176173266 container init 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:46 localhost podman[316715]: 2025-11-26 10:01:46.678718504 +0000 UTC m=+0.185920533 container start 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 
05:01:46 localhost dnsmasq[316734]: started, version 2.85 cachesize 150 Nov 26 05:01:46 localhost dnsmasq[316734]: DNS service limited to local subnets Nov 26 05:01:46 localhost dnsmasq[316734]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:46 localhost dnsmasq[316734]: warning: no upstream servers configured Nov 26 05:01:46 localhost dnsmasq-dhcp[316734]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:01:46 localhost dnsmasq[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:46 localhost dnsmasq-dhcp[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:46 localhost dnsmasq-dhcp[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:46.747 262471 INFO neutron.agent.dhcp.agent [None req-007a81fb-bc4e-49e0-9073-3de58c822265 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:45Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1d99a163-89c0-4f7e-86a3-4dd77f359af9, ip_allocation=immediate, mac_address=fa:16:3e:ea:21:4a, name=tempest-NetworksTestDHCPv6-1560401890, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, 
provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=44, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['2aaf1150-46d1-4605-9e40-09474296917f'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:42Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1548, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:45Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:01:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:46.820 262471 INFO neutron.agent.dhcp.agent [None req-53aa2319-28c7-4a55-8a07-ba04310480a1 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:01:46 localhost dnsmasq[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses Nov 26 05:01:46 localhost podman[316753]: 2025-11-26 10:01:46.955076155 +0000 UTC m=+0.067386458 container kill 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:01:46 localhost dnsmasq-dhcp[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:46 localhost 
dnsmasq-dhcp[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:47.234 262471 INFO neutron.agent.dhcp.agent [None req-736d1cfa-6588-4383-a82b-b4d1f82e9a65 - - - - - -] DHCP configuration for ports {'1d99a163-89c0-4f7e-86a3-4dd77f359af9'} is completed#033[00m Nov 26 05:01:47 localhost dnsmasq[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:47 localhost dnsmasq-dhcp[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:47 localhost dnsmasq-dhcp[316734]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:47 localhost podman[316791]: 2025-11-26 10:01:47.36652536 +0000 UTC m=+0.063844645 container kill 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:47 localhost nova_compute[281415]: 2025-11-26 10:01:47.423 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:47 localhost systemd[1]: tmp-crun.cH2g8U.mount: Deactivated successfully. 
Nov 26 05:01:47 localhost dnsmasq[316734]: exiting on receipt of SIGTERM Nov 26 05:01:47 localhost podman[316829]: 2025-11-26 10:01:47.858692574 +0000 UTC m=+0.077598349 container kill 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:47 localhost systemd[1]: libpod-5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd.scope: Deactivated successfully. Nov 26 05:01:47 localhost podman[316842]: 2025-11-26 10:01:47.933754598 +0000 UTC m=+0.060650350 container died 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:47 localhost podman[316842]: 2025-11-26 10:01:47.969876823 +0000 UTC m=+0.096772525 container cleanup 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:47 localhost systemd[1]: libpod-conmon-5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd.scope: Deactivated successfully. Nov 26 05:01:48 localhost podman[316849]: 2025-11-26 10:01:48.028991346 +0000 UTC m=+0.134827977 container remove 5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:48 localhost ovn_controller[153664]: 2025-11-26T10:01:48Z|00267|binding|INFO|Releasing lport 2093a468-29e7-43ca-9d5a-392fda005f4f from this chassis (sb_readonly=0) Nov 26 05:01:48 localhost kernel: device tap2093a468-29 left promiscuous mode Nov 26 05:01:48 localhost ovn_controller[153664]: 2025-11-26T10:01:48Z|00268|binding|INFO|Setting lport 2093a468-29e7-43ca-9d5a-392fda005f4f down in Southbound Nov 26 05:01:48 localhost nova_compute[281415]: 2025-11-26 10:01:48.047 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:48.067 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], 
virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feed:c748/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2093a468-29e7-43ca-9d5a-392fda005f4f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:48.071 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 2093a468-29e7-43ca-9d5a-392fda005f4f in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:01:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:48.073 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:48 localhost nova_compute[281415]: 2025-11-26 10:01:48.077 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:48 
localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:48.077 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ee7a8d-c6cd-4dfd-99f7-821cfd6ec206]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:48 localhost nova_compute[281415]: 2025-11-26 10:01:48.078 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:48 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e128 e128: 6 total, 6 up, 6 in Nov 26 05:01:48 localhost systemd[1]: var-lib-containers-storage-overlay-b570cc275f422f53e906e01fcc5a7f395c30391052c13dc9572d5f87eef656f3-merged.mount: Deactivated successfully. Nov 26 05:01:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c1d7ddc29702a1f7997d19a2d320636c98e88b6fa1b894782fcb2ee9ddb7ddd-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:48 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. 
Nov 26 05:01:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:49.133 262471 INFO neutron.agent.linux.ip_lib [None req-6fa1c002-d573-4f0a-8f18-33027a5be452 - - - - - -] Device tap2c24640d-80 cannot be used as it has no MAC address#033[00m Nov 26 05:01:49 localhost nova_compute[281415]: 2025-11-26 10:01:49.164 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:49 localhost kernel: device tap2c24640d-80 entered promiscuous mode Nov 26 05:01:49 localhost NetworkManager[5970]: [1764151309.1724] manager: (tap2c24640d-80): new Generic device (/org/freedesktop/NetworkManager/Devices/45) Nov 26 05:01:49 localhost ovn_controller[153664]: 2025-11-26T10:01:49Z|00269|binding|INFO|Claiming lport 2c24640d-8002-4f59-be93-e944ea6d0a30 for this chassis. Nov 26 05:01:49 localhost nova_compute[281415]: 2025-11-26 10:01:49.172 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:49 localhost ovn_controller[153664]: 2025-11-26T10:01:49Z|00270|binding|INFO|2c24640d-8002-4f59-be93-e944ea6d0a30: Claiming unknown Nov 26 05:01:49 localhost systemd-udevd[316879]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:01:49 localhost ovn_controller[153664]: 2025-11-26T10:01:49Z|00271|binding|INFO|Setting lport 2c24640d-8002-4f59-be93-e944ea6d0a30 ovn-installed in OVS Nov 26 05:01:49 localhost nova_compute[281415]: 2025-11-26 10:01:49.183 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:49 localhost nova_compute[281415]: 2025-11-26 10:01:49.186 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:49 localhost ovn_controller[153664]: 2025-11-26T10:01:49Z|00272|binding|INFO|Setting lport 2c24640d-8002-4f59-be93-e944ea6d0a30 up in Southbound Nov 26 05:01:49 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:49.189 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=2c24640d-8002-4f59-be93-e944ea6d0a30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:49 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:49.191 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 2c24640d-8002-4f59-be93-e944ea6d0a30 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:01:49 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:49.193 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2ad04435-39db-4fa4-be3b-80b6eccc4923 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:49 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:49.193 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:49 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:49.195 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[dc47e72b-9b37-41cc-a6db-cfcbbecb998c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:49 localhost nova_compute[281415]: 2025-11-26 10:01:49.215 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:49 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:49.225 2 INFO neutron.agent.securitygroups_rpc [None req-e1869391-86cf-4a14-a629-5212059b9a21 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:49 localhost nova_compute[281415]: 2025-11-26 10:01:49.258 
281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:49 localhost nova_compute[281415]: 2025-11-26 10:01:49.294 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:49 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:49.918 2 INFO neutron.agent.securitygroups_rpc [None req-925d5162-a504-4bc8-8bdc-e2a35b5d9f80 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:50 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:50.083 2 INFO neutron.agent.securitygroups_rpc [None req-98e713a0-1c42-4d3d-bc60-c993d4403715 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:50 localhost podman[316934]: Nov 26 05:01:50 localhost podman[316934]: 2025-11-26 10:01:50.225404702 +0000 UTC m=+0.093213339 container create 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 05:01:50 localhost systemd[1]: Started libpod-conmon-3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc.scope. Nov 26 05:01:50 localhost systemd[1]: tmp-crun.S0NUvc.mount: Deactivated successfully. 
Nov 26 05:01:50 localhost podman[316934]: 2025-11-26 10:01:50.181507377 +0000 UTC m=+0.049316034 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:50 localhost systemd[1]: Started libcrun container. Nov 26 05:01:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/556a21e2f6f80bea0b3776d5fdd7e25bae8c2779d132ef385a597cdc042dfa68/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:50 localhost podman[316934]: 2025-11-26 10:01:50.314743217 +0000 UTC m=+0.182551854 container init 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:50 localhost podman[316934]: 2025-11-26 10:01:50.324251777 +0000 UTC m=+0.192060404 container start 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:01:50 localhost dnsmasq[316953]: started, version 2.85 cachesize 150 Nov 26 05:01:50 localhost dnsmasq[316953]: DNS service limited to local subnets Nov 26 05:01:50 localhost 
dnsmasq[316953]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:50 localhost dnsmasq[316953]: warning: no upstream servers configured Nov 26 05:01:50 localhost dnsmasq-dhcp[316953]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:01:50 localhost dnsmasq[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:50 localhost dnsmasq-dhcp[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:50 localhost dnsmasq-dhcp[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:50 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:50.392 262471 INFO neutron.agent.dhcp.agent [None req-6fa1c002-d573-4f0a-8f18-33027a5be452 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:48Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=14c04c1d-7aa5-405d-9bbd-fd7bbee70faf, ip_allocation=immediate, mac_address=fa:16:3e:52:1c:d5, name=tempest-NetworksTestDHCPv6-828538829, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=46, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, 
subnets=['ac73b329-815f-4627-9a3d-368a4e05734b'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:47Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1589, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:01:48Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:01:50 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:50.472 2 INFO neutron.agent.securitygroups_rpc [None req-22b5bc13-c480-45e9-91d5-b33f407b58e0 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:01:50 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:50.529 262471 INFO neutron.agent.dhcp.agent [None req-9090139b-edaa-4c60-ae88-78eb83910011 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:01:50 localhost podman[316970]: 2025-11-26 10:01:50.602309168 +0000 UTC m=+0.067190302 container kill 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:01:50 localhost dnsmasq[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 1 addresses 
Nov 26 05:01:50 localhost dnsmasq-dhcp[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:50 localhost dnsmasq-dhcp[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:50 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:50.801 2 INFO neutron.agent.securitygroups_rpc [None req-fbd77e20-2511-406e-b7bf-7c1bec4edf10 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:50 localhost ovn_controller[153664]: 2025-11-26T10:01:50Z|00273|binding|INFO|Releasing lport 2c24640d-8002-4f59-be93-e944ea6d0a30 from this chassis (sb_readonly=0) Nov 26 05:01:50 localhost kernel: device tap2c24640d-80 left promiscuous mode Nov 26 05:01:50 localhost ovn_controller[153664]: 2025-11-26T10:01:50Z|00274|binding|INFO|Setting lport 2c24640d-8002-4f59-be93-e944ea6d0a30 down in Southbound Nov 26 05:01:50 localhost nova_compute[281415]: 2025-11-26 10:01:50.820 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:50 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:50.831 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2c24640d-8002-4f59-be93-e944ea6d0a30) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:50 localhost nova_compute[281415]: 2025-11-26 10:01:50.839 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:50 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:50.842 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 2c24640d-8002-4f59-be93-e944ea6d0a30 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:01:50 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:50.844 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:50 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:50.845 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d0f660-2c3c-4b04-9bb6-71d288a52858]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:50 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:50.849 262471 INFO neutron.agent.dhcp.agent [None req-79d3a48f-f8a1-444a-9af1-1e5a1bb7afd5 - - - - - -] DHCP configuration for ports {'14c04c1d-7aa5-405d-9bbd-fd7bbee70faf'} is completed#033[00m 
Nov 26 05:01:51 localhost dnsmasq[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:51 localhost dnsmasq-dhcp[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:01:51 localhost podman[317011]: 2025-11-26 10:01:51.044439347 +0000 UTC m=+0.064835873 container kill 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:01:51 localhost dnsmasq-dhcp[316953]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent [None req-6fa1c002-d573-4f0a-8f18-33027a5be452 - - - - - -] Unable to reload_allocations dhcp for cc3dc995-51cd-4d70-be2c-11c47524552d.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap2c24640d-80 not found in namespace qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d. 
Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 26 05:01:51 
localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent return fut.result() Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent raise self._exception Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap2c24640d-80 not found in namespace qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d. Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.071 262471 ERROR neutron.agent.dhcp.agent #033[00m Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.075 262471 INFO neutron.agent.dhcp.agent [None req-691a5d6e-e46c-4c84-8d49-e89d3d4a0607 - - - - - -] Synchronizing state#033[00m Nov 26 05:01:51 localhost nova_compute[281415]: 2025-11-26 10:01:51.207 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:51 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:51.207 2 INFO neutron.agent.securitygroups_rpc [None req-454f31e4-0326-455d-a70c-bdda94017117 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.234 262471 INFO neutron.agent.dhcp.agent [None req-c5a2b9df-a10c-4fbe-a719-a325cd188f8b - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.236 262471 INFO neutron.agent.dhcp.agent [-] Starting network cc3dc995-51cd-4d70-be2c-11c47524552d dhcp configuration#033[00m Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.236 262471 INFO neutron.agent.dhcp.agent [-] Finished network cc3dc995-51cd-4d70-be2c-11c47524552d dhcp configuration#033[00m Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.237 262471 INFO neutron.agent.dhcp.agent [None 
req-c5a2b9df-a10c-4fbe-a719-a325cd188f8b - - - - - -] Synchronizing state complete#033[00m Nov 26 05:01:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:51.301 262471 INFO neutron.agent.dhcp.agent [None req-4cfd357b-cb40-4f05-912e-49e5c9e9a5b2 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:01:51 localhost dnsmasq[316953]: exiting on receipt of SIGTERM Nov 26 05:01:51 localhost podman[317042]: 2025-11-26 10:01:51.486396061 +0000 UTC m=+0.058668681 container kill 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:01:51 localhost systemd[1]: libpod-3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc.scope: Deactivated successfully. 
Nov 26 05:01:51 localhost podman[317055]: 2025-11-26 10:01:51.562144385 +0000 UTC m=+0.060632978 container died 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:01:51 localhost podman[317055]: 2025-11-26 10:01:51.600989441 +0000 UTC m=+0.099477984 container cleanup 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 05:01:51 localhost systemd[1]: libpod-conmon-3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc.scope: Deactivated successfully. 
Nov 26 05:01:51 localhost podman[317062]: 2025-11-26 10:01:51.653086158 +0000 UTC m=+0.135892119 container remove 3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:01:51 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:51.921 2 INFO neutron.agent.securitygroups_rpc [None req-6ad0552b-9eff-441f-b6f5-35717f0457a1 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:52 localhost systemd[1]: var-lib-containers-storage-overlay-556a21e2f6f80bea0b3776d5fdd7e25bae8c2779d132ef385a597cdc042dfa68-merged.mount: Deactivated successfully. Nov 26 05:01:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3152c1dbb7229fe0bd386214fcca9238bbd0e063c7952a1c59df8707bed5cecc-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:52 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:52.263 2 INFO neutron.agent.securitygroups_rpc [None req-d0a7ee9a-e27f-441c-bae9-fd0be4f8c3b5 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:01:52 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. 
Nov 26 05:01:52 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:52.338 262471 INFO neutron.agent.dhcp.agent [None req-90e60123-8a6a-45af-9cda-104b0acbf649 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:01:52 localhost nova_compute[281415]: 2025-11-26 10:01:52.459 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:52 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:52.785 2 INFO neutron.agent.securitygroups_rpc [None req-e5bb332b-7240-41f3-8b1d-d5b172751a97 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:53 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:53.135 2 INFO neutron.agent.securitygroups_rpc [None req-39a1c2be-e411-4022-a4d0-31a2eaae326b a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:01:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:56 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:56.093 2 INFO neutron.agent.securitygroups_rpc [None req-0ca5e6a3-0705-4853-b9b6-ff9205480f1c a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:01:56 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:56.095 2 INFO neutron.agent.securitygroups_rpc [None req-de52665f-87f6-48b6-813c-8585a8265b05 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:01:56 localhost 
nova_compute[281415]: 2025-11-26 10:01:56.121 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:56 localhost ovn_controller[153664]: 2025-11-26T10:01:56Z|00275|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:56 localhost ovn_controller[153664]: 2025-11-26T10:01:56Z|00276|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:56 localhost nova_compute[281415]: 2025-11-26 10:01:56.146 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:56.200 262471 INFO neutron.agent.linux.ip_lib [None req-ac477aa7-017e-4db2-b8a1-485acbeb3142 - - - - - -] Device tapa77a60af-05 cannot be used as it has no MAC address#033[00m Nov 26 05:01:56 localhost nova_compute[281415]: 2025-11-26 10:01:56.208 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:56 localhost nova_compute[281415]: 2025-11-26 10:01:56.221 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:56 localhost kernel: device tapa77a60af-05 entered promiscuous mode Nov 26 05:01:56 localhost NetworkManager[5970]: [1764151316.2291] manager: (tapa77a60af-05): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Nov 26 05:01:56 localhost systemd-udevd[317095]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:01:56 localhost ovn_controller[153664]: 2025-11-26T10:01:56Z|00277|binding|INFO|Claiming lport a77a60af-05fc-49aa-94e8-bc68d9db7bd5 for this chassis. 
Nov 26 05:01:56 localhost ovn_controller[153664]: 2025-11-26T10:01:56Z|00278|binding|INFO|a77a60af-05fc-49aa-94e8-bc68d9db7bd5: Claiming unknown Nov 26 05:01:56 localhost nova_compute[281415]: 2025-11-26 10:01:56.233 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:56.247 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fedb:5f99/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a77a60af-05fc-49aa-94e8-bc68d9db7bd5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:56.249 159486 INFO neutron.agent.ovn.metadata.agent [-] Port a77a60af-05fc-49aa-94e8-bc68d9db7bd5 in datapath 
cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:01:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:56.252 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port dd62f742-d769-4622-b9f4-2359430d799a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:01:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:56.252 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:01:56 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:56.253 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1f6e54-4652-4f98-acfa-6e4100e71a82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:56 localhost ovn_controller[153664]: 2025-11-26T10:01:56Z|00279|binding|INFO|Setting lport a77a60af-05fc-49aa-94e8-bc68d9db7bd5 ovn-installed in OVS Nov 26 05:01:56 localhost ovn_controller[153664]: 2025-11-26T10:01:56Z|00280|binding|INFO|Setting lport a77a60af-05fc-49aa-94e8-bc68d9db7bd5 up in Southbound Nov 26 05:01:56 localhost nova_compute[281415]: 2025-11-26 10:01:56.275 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:56 localhost nova_compute[281415]: 2025-11-26 10:01:56.313 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:56 localhost nova_compute[281415]: 2025-11-26 10:01:56.339 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:56 localhost 
neutron_sriov_agent[255515]: 2025-11-26 10:01:56.668 2 INFO neutron.agent.securitygroups_rpc [None req-2f8b5a04-5a00-4262-89d9-3e1cf71d8452 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:57 localhost podman[317150]: Nov 26 05:01:57 localhost podman[317150]: 2025-11-26 10:01:57.209551787 +0000 UTC m=+0.101147734 container create c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:01:57 localhost systemd[1]: Started libpod-conmon-c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85.scope. Nov 26 05:01:57 localhost podman[317150]: 2025-11-26 10:01:57.164712365 +0000 UTC m=+0.056308382 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:57 localhost systemd[1]: Started libcrun container. 
Nov 26 05:01:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5b270cbadc5e9d7cbef0a2b012c6d3c0a99b3739588baf78be45094719fed2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:57 localhost podman[317150]: 2025-11-26 10:01:57.28388599 +0000 UTC m=+0.175481937 container init c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 05:01:57 localhost nova_compute[281415]: 2025-11-26 10:01:57.329 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:57 localhost ovn_controller[153664]: 2025-11-26T10:01:57Z|00281|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:57 localhost podman[317150]: 2025-11-26 10:01:57.336175632 +0000 UTC m=+0.227771589 container start c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:01:57 localhost dnsmasq[317169]: started, 
version 2.85 cachesize 150 Nov 26 05:01:57 localhost dnsmasq[317169]: DNS service limited to local subnets Nov 26 05:01:57 localhost dnsmasq[317169]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:57 localhost dnsmasq[317169]: warning: no upstream servers configured Nov 26 05:01:57 localhost dnsmasq[317169]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:01:57 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:57.358 2 INFO neutron.agent.securitygroups_rpc [None req-8dee1088-e3a1-40fa-bc39-f759ffed6920 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:01:57 localhost nova_compute[281415]: 2025-11-26 10:01:57.462 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:57 localhost podman[240049]: time="2025-11-26T10:01:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:01:57 localhost podman[240049]: @ - - [26/Nov/2025:10:01:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155581 "" "Go-http-client/1.1" Nov 26 05:01:57 localhost podman[240049]: @ - - [26/Nov/2025:10:01:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19244 "" "Go-http-client/1.1" Nov 26 05:01:57 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:57.611 2 INFO neutron.agent.securitygroups_rpc [None req-f3fa61ed-469a-4d15-8181-da9b96a9a62c 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:01:57 localhost 
neutron_dhcp_agent[262467]: 2025-11-26 10:01:57.625 262471 INFO neutron.agent.dhcp.agent [None req-117a4bc3-f716-4c1c-a4c8-b2cf4d2b9e2e - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:01:57 localhost dnsmasq[317169]: exiting on receipt of SIGTERM Nov 26 05:01:57 localhost systemd[1]: libpod-c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85.scope: Deactivated successfully. Nov 26 05:01:57 localhost podman[317204]: 2025-11-26 10:01:57.854136577 +0000 UTC m=+0.069071377 container kill c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:01:57 localhost podman[317234]: 2025-11-26 10:01:57.923571095 +0000 UTC m=+0.054960072 container died c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:01:57 localhost podman[317234]: 2025-11-26 10:01:57.959799483 +0000 UTC m=+0.091188420 container cleanup c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:01:57 localhost systemd[1]: libpod-conmon-c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85.scope: Deactivated successfully. Nov 26 05:01:57 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:57.977 262471 INFO neutron.agent.linux.ip_lib [None req-250a16f9-a8ee-404b-bc03-9dbaa6a8bbb2 - - - - - -] Device tap90579928-84 cannot be used as it has no MAC address#033[00m Nov 26 05:01:57 localhost podman[317245]: 2025-11-26 10:01:57.996913098 +0000 UTC m=+0.095135177 container remove c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:01:58 localhost nova_compute[281415]: 2025-11-26 10:01:58.012 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:58 localhost kernel: device tap90579928-84 entered promiscuous mode Nov 26 05:01:58 localhost NetworkManager[5970]: [1764151318.0209] manager: (tap90579928-84): new Generic device (/org/freedesktop/NetworkManager/Devices/47) Nov 26 
05:01:58 localhost systemd-udevd[317097]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:01:58 localhost ovn_controller[153664]: 2025-11-26T10:01:58Z|00282|binding|INFO|Claiming lport 90579928-8403-4db1-af69-5bb45b1706cf for this chassis. Nov 26 05:01:58 localhost ovn_controller[153664]: 2025-11-26T10:01:58Z|00283|binding|INFO|90579928-8403-4db1-af69-5bb45b1706cf: Claiming unknown Nov 26 05:01:58 localhost nova_compute[281415]: 2025-11-26 10:01:58.021 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:58.032 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-0ab50863-10aa-4a2b-b78b-7be59e696016', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ab50863-10aa-4a2b-b78b-7be59e696016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '083b00bb83474f96865b0c5a38c5f88f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cdb8bec-7c38-4b23-b22c-dbad3237715c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=90579928-8403-4db1-af69-5bb45b1706cf) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:01:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:58.034 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 90579928-8403-4db1-af69-5bb45b1706cf in datapath 0ab50863-10aa-4a2b-b78b-7be59e696016 bound to our chassis#033[00m Nov 26 05:01:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:58.039 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ab50863-10aa-4a2b-b78b-7be59e696016 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:01:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:01:58.040 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[87cbd9a3-2fca-4e4c-8a52-9342699fc2a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:01:58 localhost ovn_controller[153664]: 2025-11-26T10:01:58Z|00284|binding|INFO|Setting lport 90579928-8403-4db1-af69-5bb45b1706cf ovn-installed in OVS Nov 26 05:01:58 localhost ovn_controller[153664]: 2025-11-26T10:01:58Z|00285|binding|INFO|Setting lport 90579928-8403-4db1-af69-5bb45b1706cf up in Southbound Nov 26 05:01:58 localhost nova_compute[281415]: 2025-11-26 10:01:58.065 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:58 localhost ovn_controller[153664]: 2025-11-26T10:01:58Z|00286|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:58 localhost ovn_controller[153664]: 2025-11-26T10:01:58Z|00287|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:01:58 localhost nova_compute[281415]: 2025-11-26 10:01:58.088 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:58 localhost nova_compute[281415]: 2025-11-26 10:01:58.111 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:58 localhost nova_compute[281415]: 2025-11-26 10:01:58.149 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:01:58 localhost systemd[1]: var-lib-containers-storage-overlay-b5b270cbadc5e9d7cbef0a2b012c6d3c0a99b3739588baf78be45094719fed2a-merged.mount: Deactivated successfully. Nov 26 05:01:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7a11115b3d75fa772ead77c90a222b9daaf677e812c321b54247dd22395ec85-userdata-shm.mount: Deactivated successfully. Nov 26 05:01:59 localhost podman[317374]: Nov 26 05:01:59 localhost podman[317374]: 2025-11-26 10:01:59.034342614 +0000 UTC m=+0.094501078 container create cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 05:01:59 localhost systemd[1]: Started libpod-conmon-cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd.scope. Nov 26 05:01:59 localhost podman[317374]: 2025-11-26 10:01:58.98810793 +0000 UTC m=+0.048266404 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:01:59 localhost systemd[1]: Started libcrun container. 
Nov 26 05:01:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff121ff0638570a01e9f9e63dad26aa114fa0f7b01074f18d749550add2cde8c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:01:59 localhost podman[317374]: 2025-11-26 10:01:59.135451926 +0000 UTC m=+0.195610380 container init cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:01:59 localhost podman[317374]: 2025-11-26 10:01:59.145763759 +0000 UTC m=+0.205922213 container start cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 26 05:01:59 localhost dnsmasq[317392]: started, version 2.85 cachesize 150 Nov 26 05:01:59 localhost dnsmasq[317392]: DNS service limited to local subnets Nov 26 05:01:59 localhost dnsmasq[317392]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:01:59 localhost dnsmasq[317392]: warning: no upstream servers 
configured Nov 26 05:01:59 localhost dnsmasq-dhcp[317392]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:01:59 localhost dnsmasq[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/addn_hosts - 0 addresses Nov 26 05:01:59 localhost dnsmasq-dhcp[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/host Nov 26 05:01:59 localhost dnsmasq-dhcp[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/opts Nov 26 05:01:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:01:59 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:59.363 2 INFO neutron.agent.securitygroups_rpc [None req-2f88910d-73c9-4396-a4eb-a11e6d110545 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:01:59 localhost neutron_sriov_agent[255515]: 2025-11-26 10:01:59.374 2 INFO neutron.agent.securitygroups_rpc [None req-526138fe-f2f0-4d8f-a493-4d983afa5008 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:01:59 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:01:59 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:01:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:59.408 262471 INFO neutron.agent.dhcp.agent [None req-93682c1d-af5c-4056-93e8-a0b7f322b299 - - - - - -] DHCP configuration for ports {'71986a4a-12c3-4bd3-8895-ba5caf6849b7'} is completed#033[00m Nov 26 05:01:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:59.458 262471 INFO neutron.agent.dhcp.agent [-] 
Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:57Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cefa2e33-4b7b-49d4-b35d-16608daaec40, ip_allocation=immediate, mac_address=fa:16:3e:40:8b:ec, name=tempest-PortsIpV6TestJSON-1084729547, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:01:54Z, description=, dns_domain=, id=0ab50863-10aa-4a2b-b78b-7be59e696016, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1021569215, port_security_enabled=True, project_id=083b00bb83474f96865b0c5a38c5f88f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=1907, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1639, status=ACTIVE, subnets=['180b27cc-507e-453f-afe3-d731aecce5e0'], tags=[], tenant_id=083b00bb83474f96865b0c5a38c5f88f, updated_at=2025-11-26T10:01:56Z, vlan_transparent=None, network_id=0ab50863-10aa-4a2b-b78b-7be59e696016, port_security_enabled=True, project_id=083b00bb83474f96865b0c5a38c5f88f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6d45445d-04cd-4f12-afe3-c4bc11ac69da'], standard_attr_id=1662, status=DOWN, tags=[], tenant_id=083b00bb83474f96865b0c5a38c5f88f, updated_at=2025-11-26T10:01:58Z on network 0ab50863-10aa-4a2b-b78b-7be59e696016#033[00m Nov 26 05:01:59 localhost dnsmasq[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/addn_hosts - 1 addresses Nov 26 05:01:59 localhost podman[317415]: 2025-11-26 10:01:59.673484943 +0000 UTC m=+0.063854854 container kill cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:01:59 localhost dnsmasq-dhcp[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/host Nov 26 05:01:59 localhost dnsmasq-dhcp[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/opts Nov 26 05:01:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:01:59.989 262471 INFO neutron.agent.dhcp.agent [None req-bee41123-993e-495f-9fc4-15a44cea4251 - - - - - -] DHCP configuration for ports {'cefa2e33-4b7b-49d4-b35d-16608daaec40'} is completed#033[00m Nov 26 05:02:00 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:00.176 2 INFO neutron.agent.securitygroups_rpc [None req-03c494c5-54a6-4075-89fc-855342539c19 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:02:00 localhost podman[317479]: Nov 26 05:02:00 localhost podman[317479]: 2025-11-26 10:02:00.397836855 +0000 UTC m=+0.085487422 container create af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack 
Kubernetes Operator team) Nov 26 05:02:00 localhost systemd[1]: Started libpod-conmon-af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e.scope. Nov 26 05:02:00 localhost nova_compute[281415]: 2025-11-26 10:02:00.439 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:00 localhost systemd[1]: tmp-crun.RdRJQr.mount: Deactivated successfully. Nov 26 05:02:00 localhost podman[317479]: 2025-11-26 10:02:00.357090394 +0000 UTC m=+0.044740961 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:00 localhost systemd[1]: Started libcrun container. Nov 26 05:02:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/848cef7f1234576da8d187493e6fc5202bfa44871f7e3a86813c69438b7ade87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:00 localhost podman[317479]: 2025-11-26 10:02:00.47937739 +0000 UTC m=+0.167027967 container init af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:00 localhost podman[317479]: 2025-11-26 10:02:00.488764577 +0000 UTC m=+0.176415144 container start af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 05:02:00 localhost dnsmasq[317497]: started, version 2.85 cachesize 150 Nov 26 05:02:00 localhost dnsmasq[317497]: DNS service limited to local subnets Nov 26 05:02:00 localhost dnsmasq[317497]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:00 localhost dnsmasq[317497]: warning: no upstream servers configured Nov 26 05:02:00 localhost dnsmasq-dhcp[317497]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 26 05:02:00 localhost dnsmasq[317497]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:02:00 localhost dnsmasq-dhcp[317497]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:00 localhost dnsmasq-dhcp[317497]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:00 localhost nova_compute[281415]: 2025-11-26 10:02:00.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:00 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:00.864 262471 INFO neutron.agent.dhcp.agent [None req-29ed7f88-316d-4c26-8beb-0d69c179df8e - - - - - -] DHCP configuration for ports {'a77a60af-05fc-49aa-94e8-bc68d9db7bd5', 'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:02:01 localhost dnsmasq[317497]: exiting 
on receipt of SIGTERM Nov 26 05:02:01 localhost podman[317515]: 2025-11-26 10:02:01.008296378 +0000 UTC m=+0.063938377 container kill af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:01 localhost systemd[1]: libpod-af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e.scope: Deactivated successfully. Nov 26 05:02:01 localhost podman[317528]: 2025-11-26 10:02:01.086121584 +0000 UTC m=+0.059864296 container died af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:01 localhost podman[317528]: 2025-11-26 10:02:01.118626463 +0000 UTC m=+0.092369145 container cleanup af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:01 localhost systemd[1]: libpod-conmon-af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e.scope: Deactivated successfully. Nov 26 05:02:01 localhost podman[317529]: 2025-11-26 10:02:01.16975823 +0000 UTC m=+0.134757335 container remove af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 05:02:01 localhost ovn_controller[153664]: 2025-11-26T10:02:01Z|00288|binding|INFO|Releasing lport a77a60af-05fc-49aa-94e8-bc68d9db7bd5 from this chassis (sb_readonly=0) Nov 26 05:02:01 localhost kernel: device tapa77a60af-05 left promiscuous mode Nov 26 05:02:01 localhost nova_compute[281415]: 2025-11-26 10:02:01.244 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:01 localhost ovn_controller[153664]: 2025-11-26T10:02:01Z|00289|binding|INFO|Setting lport a77a60af-05fc-49aa-94e8-bc68d9db7bd5 down in Southbound Nov 26 05:02:01 localhost nova_compute[281415]: 2025-11-26 10:02:01.246 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 26 05:02:01 localhost nova_compute[281415]: 2025-11-26 10:02:01.261 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:01.390 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::f816:3eff:fedb:5f99/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a77a60af-05fc-49aa-94e8-bc68d9db7bd5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:01.393 159486 INFO neutron.agent.ovn.metadata.agent [-] Port a77a60af-05fc-49aa-94e8-bc68d9db7bd5 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis#033[00m Nov 26 05:02:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:01.396 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found 
for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:01.397 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba1e391-5784-400c-97ad-ba15510557db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:01 localhost systemd[1]: var-lib-containers-storage-overlay-848cef7f1234576da8d187493e6fc5202bfa44871f7e3a86813c69438b7ade87-merged.mount: Deactivated successfully. Nov 26 05:02:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-af7c1525df162b18019ee621a718d83e95e8fa5d7ef1536554ea3fb2086a917e-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:02:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:02:01 localhost podman[317560]: 2025-11-26 10:02:01.529457208 +0000 UTC m=+0.094599470 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:02:01 localhost podman[317560]: 2025-11-26 10:02:01.541387061 +0000 UTC m=+0.106529323 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:02:01 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:02:01 localhost podman[317561]: 2025-11-26 10:02:01.636087413 +0000 UTC m=+0.198332520 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base 
Image, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:01 localhost podman[317561]: 2025-11-26 10:02:01.652401064 +0000 UTC m=+0.214646191 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:01 
localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:02:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:02.105 262471 INFO neutron.agent.dhcp.agent [None req-ea0c5d2c-c766-47d3-98a1-d6719dd028c3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:02.106 262471 INFO neutron.agent.dhcp.agent [None req-ea0c5d2c-c766-47d3-98a1-d6719dd028c3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:02 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully. Nov 26 05:02:02 localhost sshd[317604]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:02:02 localhost nova_compute[281415]: 2025-11-26 10:02:02.489 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:02 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:02.533 2 INFO neutron.agent.securitygroups_rpc [None req-9aade140-ee1b-4638-880f-a7a99b5919e8 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:02:02 localhost nova_compute[281415]: 2025-11-26 10:02:02.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:02 localhost nova_compute[281415]: 2025-11-26 10:02:02.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:02 localhost nova_compute[281415]: 2025-11-26 10:02:02.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:02 localhost nova_compute[281415]: 2025-11-26 10:02:02.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:02:03 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:03.125 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:01:57Z, description=, device_id=ee3592c2-cb90-4afb-9993-49908aa5e43a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cefa2e33-4b7b-49d4-b35d-16608daaec40, ip_allocation=immediate, mac_address=fa:16:3e:40:8b:ec, name=tempest-PortsIpV6TestJSON-1084729547, network_id=0ab50863-10aa-4a2b-b78b-7be59e696016, port_security_enabled=True, project_id=083b00bb83474f96865b0c5a38c5f88f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['6d45445d-04cd-4f12-afe3-c4bc11ac69da'], standard_attr_id=1662, status=ACTIVE, tags=[], tenant_id=083b00bb83474f96865b0c5a38c5f88f, updated_at=2025-11-26T10:02:01Z on network 0ab50863-10aa-4a2b-b78b-7be59e696016#033[00m Nov 26 05:02:03 localhost podman[317625]: 2025-11-26 10:02:03.35678419 +0000 UTC m=+0.069291305 container kill cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 26 05:02:03 localhost dnsmasq[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/addn_hosts - 1 addresses Nov 26 05:02:03 localhost dnsmasq-dhcp[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/host Nov 26 05:02:03 localhost dnsmasq-dhcp[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/opts Nov 26 05:02:03 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:03.511 2 INFO neutron.agent.securitygroups_rpc [None req-7322fd6a-7dcc-499b-af16-67a1e46c8390 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated ['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.592 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling 
/usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.599 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7ac9695-e59a-4788-9afc-13522385721b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.593764', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f53cb9ee-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': '14e4a8f1bb00278ce7be3bfdf214f51ede7abdfa865a121df84c6642d1d955dd'}]}, 'timestamp': '2025-11-26 10:02:03.600657', '_unique_id': 'b3e9858e8be5418a8c6adb972dbeb1e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory 
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.602 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.603 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.633 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.633 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dbe50d16-4aa7-4ce9-b3ec-a78ff241e7c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.603997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f541c34e-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': '9562479630b8a563d208a103c3c15be0f41056b686015235405add166207f0e1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.603997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f541d8f2-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': 'fbd18d71aace5c89888c86572782f54129e76eb21d7099e326bd1623243aef38'}]}, 'timestamp': '2025-11-26 10:02:03.634228', '_unique_id': 'a06c7e89e8764b04ae853994d9c784c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.635 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.636 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.637 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '03e54647-66e4-4b84-b18c-6ee29d201581', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7557, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.636959', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f5425b92-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': 'd722c3056375189fea03004ddb390e43d4c9bbb9a661941f492262641d7eea6f'}]}, 'timestamp': '2025-11-26 10:02:03.637477', '_unique_id': '19a5e730611541cfbfe8c1c663a976a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.638 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.639 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f823367c-86fb-4b46-b3fe-8822721b8ba3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.639643', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f542c276-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': '35ffcc43dfdae5e745275d768cf0ff6854ca9550332ddf827fac9d924309eb57'}]}, 'timestamp': '2025-11-26 10:02:03.640142', '_unique_id': 'ec80153acd8647179e2e57a650c746d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:02:03.641 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.641 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.642 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.642 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.642 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8bdb28b3-6f3e-4c90-bfe6-c269b1928fd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.642344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f5432bb2-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': '1c4f11bcc8a77bc5046c80e87389d19fb0ed3c4eacc8e6d353b03b60331a6b07'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.642344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5433d0a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': 'e0a81b083afb7c1638b766fe55ec0223bad527792ec212ffac149e1fb08b6567'}]}, 'timestamp': '2025-11-26 10:02:03.643214', '_unique_id': 'f87d83480f8f4a6fbc886b15aca474f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.644 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.645 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.645 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa373a9d-53d0-42a1-b7ee-fac394a60765', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.645567', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f543a98e-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': '5a5402d4af3ad67e1c653efb4a6e4da3dae662202c9bd290bfaae1176be549c3'}]}, 'timestamp': '2025-11-26 10:02:03.646052', '_unique_id': '70f267e882814ca9a1d572cdf7a6c3e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 
05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.646 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.648 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.648 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.648 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8879e756-359c-4c19-b690-7572f0e439cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.648157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f5440e92-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': '5fc9cd5f6de3121532de0ee69bb3defe305a6d392af9e6d841fdd05ed3664807'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.648157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5441e78-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': 'ed67fa1c8f1f34d783ffeb035bdd37b21702850766e2b3e96c25d5d50c3f2810'}]}, 'timestamp': '2025-11-26 10:02:03.649013', '_unique_id': 'd1f17bc59a2d44c98dd1939a53a48fe0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.649 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.651 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.651 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1b325243-d754-48b5-bcca-8d4cdd137afc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.651136', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f544832c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': '9b45db927946f1f7bee853c5d8d83bd76ea3083db98159be69612e5512458e06'}]}, 'timestamp': '2025-11-26 10:02:03.651614', '_unique_id': 'c8343771dce14832a45579170066768f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.652 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.653 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.653 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f54900b-4156-4dd9-b097-c58613a5debb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.653719', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f544e916-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': 'ccc7ba2950a7bd703842f5de9fdbb7d229bf62aac002e314b70aa516c7b89af6'}]}, 'timestamp': '2025-11-26 10:02:03.654199', '_unique_id': '3d117bcb82f34faf8188c67356042022'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.655 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.656 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.666 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.667 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:03.668 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:02:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:03.668 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '538479ab-7178-4838-851d-37439943bf35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.656499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f546e234-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.898753531, 'message_signature': '7231771d1b6aa207865bcd56e2291ffe9e17476071613e04a484e03c51c7017c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 
'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.656499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f546f42c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.898753531, 'message_signature': 'ca3e07cc0b760b74b53d54ec4a926eabdc1fa7275eedcff1bd5514da57970f45'}]}, 'timestamp': '2025-11-26 10:02:03.667563', '_unique_id': '0a6f640969c74089a347b0537d5a86d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:03.669 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:02:03.669 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.670 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce7d6630-12b4-466f-998a-f6d4d47db646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.669746', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 
'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f5475b06-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': '267a0641804d35c083e4fc7e028f0b95a900d2c01fcbaa2869f60fdd6e4f54d1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.669746', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5476b32-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': '8ec77c39a312733d0b6b5324bd45315c3cb94d6da9c2cd0824a8832af20dd43d'}]}, 'timestamp': '2025-11-26 10:02:03.670604', '_unique_id': '4decc77f55dc4183bdc184e3c497d0e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.672 12 DEBUG ceilometer.polling.manager [-] Skip pollster 
disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.673 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6d27afda-406f-4789-a67d-4912b335d091', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.672855', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f547d4e6-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.898753531, 'message_signature': '7090eddd8261f630d3364f083194ee9db23a4f322a42cf496d9f543822406412'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.672855', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f547e4b8-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.898753531, 'message_signature': 'c4ab5dc247fdac0c0c36ea7dc3f8db4de64e7627edee228c3e781bdb080373f2'}]}, 'timestamp': '2025-11-26 10:02:03.673716', '_unique_id': 'b6d39b6e953042b5a18d58e290614b25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:02:03.674 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:02:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.675 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.692 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 16630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '440027a5-1673-4022-be35-5c82d38a02a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16630000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:02:03.675850', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f54ac3cc-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.934231947, 'message_signature': 'c6ac34884b44bd6c21b8c22afb8d577e2bddea4bb800278560222ac467d71a0d'}]}, 'timestamp': '2025-11-26 10:02:03.692553', '_unique_id': '9221ade409884c7a9c356f4ef4e47dbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.693 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.694 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.694 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.695 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '750aff2b-916f-403c-810f-488975ab37c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.694645', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f54b2678-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': '7bf0c66b866e37811d5ddbcf7654c9c8500938ffa7454873555f8e39b49a9564'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.694645', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f54b37d0-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': '844d71efffaed2241ccaaee4d25af753d20c64e74e0b7b7d9737bc6bffe8addf'}]}, 'timestamp': '2025-11-26 10:02:03.695506', '_unique_id': '602e5d7b9dce479abd91f3aef859d9dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.696 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.697 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.allocation in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.697 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.698 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c49a405f-ff59-4fa6-baad-09f91fb21e12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.697714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f54b9f9a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.898753531, 'message_signature': 'd9c1aed052eb9fdce24638b33b018c5c3db6972c56fefd0e4790152aa9a8ace5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.697714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f54bafbc-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.898753531, 'message_signature': '69596eba773d81c4543d0ebfe84659ff0fc2c82b5656bd898ac76910c3ebbc21'}]}, 'timestamp': '2025-11-26 10:02:03.698572', '_unique_id': '203834e85e21483e808a70c444fff35c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.699 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.700 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.700 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35cf3ff0-5b75-4bf5-a646-465c3063b808', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:02:03.700681', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f54c1240-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.934231947, 
'message_signature': '0856a185769afbcff751da9652832c0b874d606151ece54b61da1645919a4fa9'}]}, 'timestamp': '2025-11-26 10:02:03.701140', '_unique_id': '47a0b9fb90274558aad5243e97111e5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.702 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.703 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '428a6e4f-be26-4e3e-a3ab-7f55372dc3c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.703189', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f54c746a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': 'b32693695a590b8546519ee19fd490baea66938b268dd88f588f71f1c52a2b35'}]}, 'timestamp': '2025-11-26 10:02:03.703660', '_unique_id': 'da0942ecedc24c6a99e2f1b5816389a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.704 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.705 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.705 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b90cabce-740e-4640-b44d-74db6f689acf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.705707', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f54cd6b2-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': 'da6bf0f6477d79ee83e555f28a8318a3b21db178ddf90e65bc09be573c18162e'}]}, 'timestamp': '2025-11-26 10:02:03.706186', '_unique_id': '2eef9ccdd4054b4e9d966c06dee1fbae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.707 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.708 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.708 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.708 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 68 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2cd9997-85f0-4b72-8895-f356633a3fd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 68, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.708634', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f54d4958-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': '579247931ca9a638ebff8ef50b28dca85a20ef490f1d237b1abd478e71172893'}]}, 'timestamp': '2025-11-26 10:02:03.709122', '_unique_id': 'a02e6324ac1842bfb8fde92b4321cc27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.709 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.711 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.711 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.711 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '589cc991-20bf-4b0b-bd20-de3afe52a89a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:02:03.711301', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'f54db1a4-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.836026811, 'message_signature': '62546e2ea080213df0a8a3170384645fdfed60cc75ec12e303c953f80ef1e02b'}]}, 'timestamp': '2025-11-26 10:02:03.711782', '_unique_id': 'fd389a3a4a0246939fb6949427693a70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in
__exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.713 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.714 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd599dbd5-7486-48f2-8fc2-ba9e5d921fd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:02:03.713819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f54e148c-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': '4e7035ab350decb14befef4aff91f2f054d8893ea169be440125d771e7f98d9c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:02:03.713819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f54e249a-caae-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11798.846253033, 'message_signature': 'dca914334701c6328d6482e0f363c6d3183b08b489160c03323da96b624bd280'}]}, 'timestamp': '2025-11-26 10:02:03.714674', '_unique_id': '0306d9072f35400cbc29743e2e4c521a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:02:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:02:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:02:03.715 12 ERROR oslo_messaging.notify.messaging Nov 26 05:02:03 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:03.722 262471 INFO neutron.agent.dhcp.agent [None req-dee6b319-f644-422d-8061-6129c3ffac06 - - - - - -] DHCP configuration for ports {'cefa2e33-4b7b-49d4-b35d-16608daaec40'} is completed#033[00m Nov 26 05:02:03 localhost nova_compute[281415]: 2025-11-26 10:02:03.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:03 localhost nova_compute[281415]: 2025-11-26 10:02:03.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task 
ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:03 localhost nova_compute[281415]: 2025-11-26 10:02:03.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:03 localhost nova_compute[281415]: 2025-11-26 10:02:03.889 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:02:03 localhost nova_compute[281415]: 2025-11-26 10:02:03.890 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:02:03 localhost nova_compute[281415]: 2025-11-26 10:02:03.890 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:02:03 localhost nova_compute[281415]: 2025-11-26 10:02:03.890 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:02:03 localhost 
nova_compute[281415]: 2025-11-26 10:02:03.891 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:02:04 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 05:02:04 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2344 writes, 23K keys, 2344 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.07 MB/s#012Cumulative WAL: 2344 writes, 2344 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2344 writes, 23K keys, 2344 commit groups, 1.0 writes per commit group, ingest: 39.51 MB, 0.07 MB/s#012Interval WAL: 2344 writes, 2344 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 164.3 0.17 0.08 10 0.017 0 0 0.0 0.0#012 L6 1/0 14.00 MB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 5.0 210.0 191.0 0.72 0.43 9 0.080 106K 4533 0.0 0.0#012 Sum 1/0 14.00 MB 0.0 0.1 0.0 0.1 0.2 0.0 0.0 6.0 170.2 186.0 0.89 0.51 19 0.047 106K 4533 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.2 0.0 0.0 6.0 170.8 186.7 0.88 0.51 18 0.049 106K 4533 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score 
Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 0.0 210.0 191.0 0.72 0.43 9 0.080 106K 4533 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 167.6 0.16 0.08 9 0.018 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.5 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.027, interval 0.027#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.16 GB write, 0.28 MB/s write, 0.15 GB read, 0.25 MB/s read, 0.9 seconds#012Interval compaction: 0.16 GB write, 0.28 MB/s write, 0.15 GB read, 0.25 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5572d8ebd350#2 capacity: 308.00 MB usage: 28.67 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000397 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1912,27.92 MB,9.06474%) FilterBlock(19,335.42 KB,0.106351%) IndexBlock(19,428.89 KB,0.135987%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Nov 26 05:02:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e128 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:02:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:02:04 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2501200131' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.364 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:02:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e129 e129: 6 total, 6 up, 6 in Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.517 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.517 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:02:04 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:04.618 2 INFO neutron.agent.securitygroups_rpc [None req-67297cae-0045-4a2b-8f24-71b528ce2df4 a244375c66834570bc0317c372dd1a3e cb3e8514b34843fd8d83b236fed9b365 - - default default] Security group member updated 
['b21e670b-5e2f-49a1-b70b-91a5efe6229b']#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.769 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.771 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11285MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.772 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.772 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.951 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.951 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.952 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:02:04 localhost nova_compute[281415]: 2025-11-26 10:02:04.995 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:02:05 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:05.092 2 INFO neutron.agent.securitygroups_rpc [None req-2445300d-ce7e-4426-889c-48c3febba7c2 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']#033[00m Nov 26 05:02:05 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:05.233 262471 INFO neutron.agent.linux.ip_lib [None req-f89c0846-18e1-4354-ab24-0e1fa0928b9c - - - - - -] Device tapd40c88ee-08 cannot be used as it has no MAC address#033[00m Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.258 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:05 localhost kernel: device tapd40c88ee-08 entered promiscuous mode Nov 26 05:02:05 localhost NetworkManager[5970]: [1764151325.2688] manager: (tapd40c88ee-08): new Generic device (/org/freedesktop/NetworkManager/Devices/48) Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.268 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:05 localhost ovn_controller[153664]: 2025-11-26T10:02:05Z|00290|binding|INFO|Claiming lport d40c88ee-08b8-4e01-8b40-df66e45034fe for this chassis. Nov 26 05:02:05 localhost ovn_controller[153664]: 2025-11-26T10:02:05Z|00291|binding|INFO|d40c88ee-08b8-4e01-8b40-df66e45034fe: Claiming unknown Nov 26 05:02:05 localhost systemd-udevd[317713]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:02:05 localhost ovn_controller[153664]: 2025-11-26T10:02:05Z|00292|binding|INFO|Setting lport d40c88ee-08b8-4e01-8b40-df66e45034fe ovn-installed in OVS Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.295 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.307 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:05 localhost journal[229445]: ethtool ioctl error on tapd40c88ee-08: No such device Nov 26 05:02:05 localhost journal[229445]: ethtool ioctl error on tapd40c88ee-08: No such device Nov 26 05:02:05 localhost journal[229445]: ethtool ioctl error on tapd40c88ee-08: No such device Nov 26 05:02:05 localhost journal[229445]: ethtool ioctl error on tapd40c88ee-08: No such device Nov 26 05:02:05 localhost journal[229445]: ethtool ioctl error on tapd40c88ee-08: No such device Nov 
26 05:02:05 localhost journal[229445]: ethtool ioctl error on tapd40c88ee-08: No such device Nov 26 05:02:05 localhost ovn_controller[153664]: 2025-11-26T10:02:05Z|00293|binding|INFO|Setting lport d40c88ee-08b8-4e01-8b40-df66e45034fe up in Southbound Nov 26 05:02:05 localhost journal[229445]: ethtool ioctl error on tapd40c88ee-08: No such device Nov 26 05:02:05 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:05.342 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe50:ae77/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d40c88ee-08b8-4e01-8b40-df66e45034fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:05 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:05.344 159486 INFO neutron.agent.ovn.metadata.agent [-] Port d40c88ee-08b8-4e01-8b40-df66e45034fe in datapath 
cc3dc995-51cd-4d70-be2c-11c47524552d bound to our chassis#033[00m Nov 26 05:02:05 localhost journal[229445]: ethtool ioctl error on tapd40c88ee-08: No such device Nov 26 05:02:05 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:05.346 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port c9632e6d-bbe6-4f0d-b17b-4301a6ab179c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:02:05 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:05.347 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:05 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:05.348 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4c206b5a-2e23-472f-8497-9e7c33c30141]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.364 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.396 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:05 localhost podman[317727]: 2025-11-26 10:02:05.412198297 +0000 UTC m=+0.075511498 container kill cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:02:05 localhost dnsmasq[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/addn_hosts - 0 addresses Nov 26 05:02:05 localhost dnsmasq-dhcp[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/host Nov 26 05:02:05 localhost dnsmasq-dhcp[317392]: read /var/lib/neutron/dhcp/0ab50863-10aa-4a2b-b78b-7be59e696016/opts Nov 26 05:02:05 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:02:05 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4136156163' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.466 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.473 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.517 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 
'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.519 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:02:05 localhost nova_compute[281415]: 2025-11-26 10:02:05.520 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:02:06 localhost nova_compute[281415]: 2025-11-26 10:02:06.207 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:06 localhost ovn_controller[153664]: 2025-11-26T10:02:06Z|00294|binding|INFO|Releasing lport 90579928-8403-4db1-af69-5bb45b1706cf from this chassis (sb_readonly=0) Nov 26 05:02:06 localhost kernel: device tap90579928-84 left promiscuous mode Nov 26 05:02:06 localhost ovn_controller[153664]: 2025-11-26T10:02:06Z|00295|binding|INFO|Setting lport 90579928-8403-4db1-af69-5bb45b1706cf down in Southbound Nov 26 05:02:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:06.218 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-0ab50863-10aa-4a2b-b78b-7be59e696016', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ab50863-10aa-4a2b-b78b-7be59e696016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '083b00bb83474f96865b0c5a38c5f88f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cdb8bec-7c38-4b23-b22c-dbad3237715c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=90579928-8403-4db1-af69-5bb45b1706cf) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:06.220 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 90579928-8403-4db1-af69-5bb45b1706cf in datapath 0ab50863-10aa-4a2b-b78b-7be59e696016 unbound from our chassis#033[00m Nov 26 05:02:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:06.222 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ab50863-10aa-4a2b-b78b-7be59e696016 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:02:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:06.223 159592 DEBUG oslo.privsep.daemon [-] privsep: 
reply[a48609e2-354c-44e2-b8e6-a259fd11bbbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:06 localhost nova_compute[281415]: 2025-11-26 10:02:06.225 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:06 localhost nova_compute[281415]: 2025-11-26 10:02:06.246 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:06 localhost podman[317811]: Nov 26 05:02:06 localhost podman[317811]: 2025-11-26 10:02:06.404046508 +0000 UTC m=+0.098364501 container create 69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:02:06 localhost systemd[1]: Started libpod-conmon-69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf.scope. Nov 26 05:02:06 localhost podman[317811]: 2025-11-26 10:02:06.358233458 +0000 UTC m=+0.052551501 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e130 e130: 6 total, 6 up, 6 in Nov 26 05:02:06 localhost systemd[1]: Started libcrun container. 
Nov 26 05:02:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d6f34bf4ee47eacbe351b19bed938044c1b589405cb0ee63b403a0d2e157450/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:06 localhost systemd-journald[47778]: Data hash table of /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation. Nov 26 05:02:06 localhost systemd-journald[47778]: /run/log/journal/ea6370aa35b896eb1e7cdbd81aa316d7/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 26 05:02:06 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 05:02:06 localhost podman[317811]: 2025-11-26 10:02:06.490948131 +0000 UTC m=+0.185266114 container init 69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:02:06 localhost podman[317811]: 2025-11-26 10:02:06.503063409 +0000 UTC m=+0.197381392 container start 69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:02:06 localhost dnsmasq[317831]: started, version 2.85 cachesize 150 Nov 26 05:02:06 localhost dnsmasq[317831]: DNS service limited to local subnets Nov 26 05:02:06 localhost dnsmasq[317831]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:06 localhost dnsmasq[317831]: warning: no upstream servers configured Nov 26 05:02:06 localhost dnsmasq-dhcp[317831]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:02:06 localhost dnsmasq[317831]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:02:06 localhost dnsmasq-dhcp[317831]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:06 localhost dnsmasq-dhcp[317831]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:06 localhost rsyslogd[760]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 26 05:02:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:06.811 262471 INFO neutron.agent.dhcp.agent [None req-e548a450-4dba-487c-b5b5-ad2908197e1a - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898'} is completed#033[00m Nov 26 05:02:06 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:06.859 2 INFO neutron.agent.securitygroups_rpc [None req-113a8e39-3aa7-4dc9-86f1-dd9d0dcadaf6 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:02:07 localhost dnsmasq[317831]: exiting on receipt of SIGTERM Nov 26 05:02:07 localhost podman[317849]: 2025-11-26 10:02:07.038287804 +0000 UTC m=+0.067543623 container kill 69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 05:02:07 localhost systemd[1]: libpod-69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf.scope: Deactivated successfully. 
Nov 26 05:02:07 localhost podman[317862]: 2025-11-26 10:02:07.110608567 +0000 UTC m=+0.058631580 container died 69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 05:02:07 localhost podman[317862]: 2025-11-26 10:02:07.145797195 +0000 UTC m=+0.093820168 container cleanup 69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:02:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:02:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:02:07 localhost systemd[1]: libpod-conmon-69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf.scope: Deactivated successfully. 
Nov 26 05:02:07 localhost podman[317868]: 2025-11-26 10:02:07.213002156 +0000 UTC m=+0.146264074 container remove 69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:02:07 localhost podman[317889]: 2025-11-26 10:02:07.263197117 +0000 UTC m=+0.094900820 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251118) Nov 26 05:02:07 localhost podman[317890]: 2025-11-26 10:02:07.338982822 +0000 UTC m=+0.166102699 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter) Nov 26 05:02:07 localhost podman[317889]: 2025-11-26 10:02:07.344989169 +0000 UTC m=+0.176692922 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 26 05:02:07 localhost podman[317890]: 2025-11-26 10:02:07.357495208 +0000 UTC m=+0.184615075 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible) Nov 26 05:02:07 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:02:07 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:02:07 localhost systemd[1]: tmp-crun.P49KOG.mount: Deactivated successfully. Nov 26 05:02:07 localhost systemd[1]: var-lib-containers-storage-overlay-9d6f34bf4ee47eacbe351b19bed938044c1b589405cb0ee63b403a0d2e157450-merged.mount: Deactivated successfully. 
Nov 26 05:02:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-69afef6c812098ad2271e04125f5b9145e8d9cc437d77c5a1c770988a54bfacf-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.492 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.516 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.544 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.545 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.545 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:02:07 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:07.655 2 INFO neutron.agent.securitygroups_rpc [None req-403e895a-f73c-46e6-b8bd-fb44e75b7a19 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated 
['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.703 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.704 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.704 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:02:07 localhost nova_compute[281415]: 2025-11-26 10:02:07.705 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:02:07 localhost podman[317953]: 2025-11-26 10:02:07.81905783 +0000 UTC m=+0.065599135 container kill cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:07 localhost dnsmasq[317392]: exiting on receipt of SIGTERM Nov 26 05:02:07 localhost systemd[1]: libpod-cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd.scope: Deactivated successfully. Nov 26 05:02:07 localhost podman[317966]: 2025-11-26 10:02:07.89669957 +0000 UTC m=+0.060340590 container died cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:02:07 localhost systemd[1]: tmp-crun.CHa2kL.mount: Deactivated successfully. Nov 26 05:02:07 localhost podman[317966]: 2025-11-26 10:02:07.936796852 +0000 UTC m=+0.100437832 container cleanup cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:07 localhost systemd[1]: libpod-conmon-cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd.scope: Deactivated successfully. 
Nov 26 05:02:07 localhost podman[317967]: 2025-11-26 10:02:07.978641376 +0000 UTC m=+0.137551237 container remove cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ab50863-10aa-4a2b-b78b-7be59e696016, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:08.020 262471 INFO neutron.agent.dhcp.agent [None req-0b2f4317-5e9a-4793-8029-34bf9812a01f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:08.047 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:08 localhost nova_compute[281415]: 2025-11-26 10:02:08.048 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:08.049 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:02:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:08.254 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:08 localhost nova_compute[281415]: 2025-11-26 10:02:08.356 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:02:08 localhost nova_compute[281415]: 2025-11-26 10:02:08.401 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:02:08 localhost nova_compute[281415]: 2025-11-26 10:02:08.401 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:02:08 localhost systemd[1]: var-lib-containers-storage-overlay-ff121ff0638570a01e9f9e63dad26aa114fa0f7b01074f18d749550add2cde8c-merged.mount: Deactivated successfully. Nov 26 05:02:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cac6ab04c7d737e695819818855052ce0e52e13aaabac91b919c33911ea455dd-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:08 localhost systemd[1]: run-netns-qdhcp\x2d0ab50863\x2d10aa\x2d4a2b\x2db78b\x2d7be59e696016.mount: Deactivated successfully. Nov 26 05:02:08 localhost ovn_controller[153664]: 2025-11-26T10:02:08Z|00296|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:02:08 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e131 e131: 6 total, 6 up, 6 in Nov 26 05:02:08 localhost nova_compute[281415]: 2025-11-26 10:02:08.528 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:08 localhost podman[318046]: Nov 26 05:02:08 localhost podman[318046]: 2025-11-26 10:02:08.946571473 +0000 UTC m=+0.096282420 container create 1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:08 localhost systemd[1]: Started libpod-conmon-1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319.scope. Nov 26 05:02:09 localhost podman[318046]: 2025-11-26 10:02:08.902493853 +0000 UTC m=+0.052204900 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:09 localhost systemd[1]: tmp-crun.Zphcd4.mount: Deactivated successfully. Nov 26 05:02:09 localhost systemd[1]: Started libcrun container. Nov 26 05:02:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1a4f69de5399b720c2e2343d89465ebb3182711ec1efc3128467b2bc43bf962/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:09 localhost podman[318046]: 2025-11-26 10:02:09.031292331 +0000 UTC m=+0.181003278 container init 1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 05:02:09 localhost podman[318046]: 2025-11-26 10:02:09.040176343 +0000 UTC m=+0.189887290 container start 1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:02:09 localhost dnsmasq[318064]: started, version 2.85 cachesize 150 Nov 26 05:02:09 localhost dnsmasq[318064]: DNS service limited to local subnets Nov 26 05:02:09 localhost dnsmasq[318064]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:09 localhost dnsmasq[318064]: warning: no upstream servers configured Nov 26 05:02:09 localhost dnsmasq-dhcp[318064]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 26 05:02:09 localhost dnsmasq-dhcp[318064]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:02:09 localhost dnsmasq[318064]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:02:09 localhost dnsmasq-dhcp[318064]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:09 localhost dnsmasq-dhcp[318064]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:09 localhost podman[318081]: 2025-11-26 10:02:09.439313725 +0000 UTC m=+0.072280423 container kill 1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:02:09 localhost dnsmasq[318064]: exiting on receipt of SIGTERM Nov 26 05:02:09 localhost systemd[1]: libpod-1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319.scope: Deactivated successfully. Nov 26 05:02:09 localhost podman[318095]: 2025-11-26 10:02:09.513407659 +0000 UTC m=+0.058372962 container died 1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:09 localhost podman[318095]: 2025-11-26 10:02:09.596338515 +0000 UTC m=+0.141303788 container cleanup 1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:02:09 localhost systemd[1]: libpod-conmon-1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319.scope: Deactivated successfully. 
Nov 26 05:02:09 localhost podman[318097]: 2025-11-26 10:02:09.621093846 +0000 UTC m=+0.156418284 container remove 1116953a4affd035dbf179681fd2c6a1f67b5efee34f5af11dbed5a7151c8319 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:02:09 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:09.630 262471 INFO neutron.agent.dhcp.agent [None req-2ad8c0b7-7e3d-49e9-bbb5-ed072926aeff - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'd40c88ee-08b8-4e01-8b40-df66e45034fe'} is completed#033[00m Nov 26 05:02:10 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:10.271 2 INFO neutron.agent.securitygroups_rpc [None req-e63c8caa-44e0-4b83-b701-a0b44919314b a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['8ef43f4c-6294-4e42-9cc5-27ecd40a0c93']#033[00m Nov 26 05:02:10 localhost systemd[1]: var-lib-containers-storage-overlay-c1a4f69de5399b720c2e2343d89465ebb3182711ec1efc3128467b2bc43bf962-merged.mount: Deactivated successfully. 
Nov 26 05:02:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e132 e132: 6 total, 6 up, 6 in Nov 26 05:02:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:10.951 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:74:03 2001:db8:0:1:f816:3eff:feaf:7403'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba010266-c829-4775-9f81-9e5e8ac0a898) old=Port_Binding(mac=['fa:16:3e:af:74:03 2001:db8::f816:3eff:feaf:7403'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feaf:7403/64', 'neutron:device_id': 'ovnmeta-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:10.955 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba010266-c829-4775-9f81-9e5e8ac0a898 in datapath cc3dc995-51cd-4d70-be2c-11c47524552d updated#033[00m Nov 26 05:02:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:10.957 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port c9632e6d-bbe6-4f0d-b17b-4301a6ab179c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:02:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:10.958 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:10.959 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[0ccaaf18-fdfd-43b7-ae18-e3dfbb100fc5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:11 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:11.051 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:02:11 localhost nova_compute[281415]: 2025-11-26 10:02:11.247 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:11 localhost podman[318175]: Nov 26 05:02:11 localhost podman[318175]: 2025-11-26 10:02:11.32951504 +0000 UTC m=+0.063936756 container create 35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 05:02:11 localhost systemd[1]: Started libpod-conmon-35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573.scope. Nov 26 05:02:11 localhost systemd[1]: Started libcrun container. Nov 26 05:02:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 05:02:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8675092d5d2f0ddace739b4288a3fa1364e9b2d8a4042fc8d0bfd65377ab0975/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:11 localhost podman[318175]: 2025-11-26 10:02:11.392445426 +0000 UTC m=+0.126867142 container init 35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 05:02:11 localhost podman[318175]: 2025-11-26 10:02:11.298869317 +0000 UTC m=+0.033291103 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:11 localhost podman[318175]: 2025-11-26 10:02:11.403054209 +0000 UTC m=+0.137475925 container start 35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 05:02:11 localhost dnsmasq[318200]: started, version 2.85 cachesize 150 Nov 26 05:02:11 localhost dnsmasq[318200]: DNS service limited to local subnets Nov 26 05:02:11 localhost dnsmasq[318200]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:11 localhost dnsmasq[318200]: warning: no upstream servers configured Nov 26 05:02:11 localhost dnsmasq-dhcp[318200]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:02:11 localhost dnsmasq[318200]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:02:11 localhost dnsmasq-dhcp[318200]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:11 localhost dnsmasq-dhcp[318200]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:11.467 262471 INFO neutron.agent.dhcp.agent [None req-f355a96b-9d06-4666-95b4-240f89fc6f89 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:06Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=0e19300a-8fc0-457b-8a6a-d0ceb3b7dd41, ip_allocation=immediate, mac_address=fa:16:3e:db:f6:59, name=tempest-NetworksTestDHCPv6-2084550545, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['b8699109-3780-4241-823f-b3a9c2cd35b9', 
'be115e1b-e425-44fd-bdb5-c748aeb457b2'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:02:04Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1693, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:02:06Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:02:11 localhost podman[318193]: 2025-11-26 10:02:11.494093683 +0000 UTC m=+0.100928817 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:02:11 localhost podman[318193]: 2025-11-26 10:02:11.528393226 +0000 UTC m=+0.135228360 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 05:02:11 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:02:11 localhost dnsmasq[318200]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 2 addresses Nov 26 05:02:11 localhost podman[318235]: 2025-11-26 10:02:11.690074273 +0000 UTC m=+0.061428582 container kill 35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:02:11 localhost dnsmasq-dhcp[318200]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:11 localhost dnsmasq-dhcp[318200]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:11.768 262471 INFO neutron.agent.dhcp.agent [None req-f4090160-7dc6-4bc7-8824-a539e7b2add2 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'd40c88ee-08b8-4e01-8b40-df66e45034fe'} is completed#033[00m Nov 26 05:02:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:11.892 262471 INFO neutron.agent.dhcp.agent [None req-fafdeb76-1b97-4eba-b513-15d7473a7f71 - - - - - -] DHCP configuration for ports {'0e19300a-8fc0-457b-8a6a-d0ceb3b7dd41'} is completed#033[00m Nov 26 05:02:12 localhost dnsmasq[318200]: exiting on receipt of SIGTERM Nov 26 05:02:12 localhost podman[318271]: 2025-11-26 10:02:12.147006579 +0000 UTC m=+0.070179951 container kill 35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:12 localhost systemd[1]: libpod-35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573.scope: Deactivated successfully. Nov 26 05:02:12 localhost podman[318285]: 2025-11-26 10:02:12.228119071 +0000 UTC m=+0.061718191 container died 35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 05:02:12 localhost podman[318285]: 2025-11-26 10:02:12.266643617 +0000 UTC m=+0.100242737 container cleanup 35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:02:12 localhost systemd[1]: libpod-conmon-35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573.scope: Deactivated 
successfully. Nov 26 05:02:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:12.288 2 INFO neutron.agent.securitygroups_rpc [None req-88fbac63-6968-4af1-b80f-2f917a885cb2 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['8ef43f4c-6294-4e42-9cc5-27ecd40a0c93', 'da50e11f-04b7-4f09-8f8f-585014b4f5b5']#033[00m Nov 26 05:02:12 localhost podman[318287]: 2025-11-26 10:02:12.305568315 +0000 UTC m=+0.132124197 container remove 35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:02:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:12.467 2 INFO neutron.agent.securitygroups_rpc [None req-28664749-fc04-4a18-8a26-c3ee8af41636 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:02:12 localhost systemd[1]: var-lib-containers-storage-overlay-8675092d5d2f0ddace739b4288a3fa1364e9b2d8a4042fc8d0bfd65377ab0975-merged.mount: Deactivated successfully. Nov 26 05:02:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35fa164138fe8feb6ffe07a57150d5b8b9d228f68356e8b876f12a708c29a573-userdata-shm.mount: Deactivated successfully. 
Nov 26 05:02:12 localhost nova_compute[281415]: 2025-11-26 10:02:12.533 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:12.839 2 INFO neutron.agent.securitygroups_rpc [None req-c6875953-efdc-44da-8a75-3c82c4b83b00 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['da50e11f-04b7-4f09-8f8f-585014b4f5b5']#033[00m Nov 26 05:02:13 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:13.096 2 INFO neutron.agent.securitygroups_rpc [None req-f67a784b-4990-470d-b501-c151f6618974 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:02:13 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:13.161 262471 INFO neutron.agent.linux.ip_lib [None req-21d42532-b9cb-424a-b0ed-0728e0b183b6 - - - - - -] Device tap0da4c8be-18 cannot be used as it has no MAC address#033[00m Nov 26 05:02:13 localhost nova_compute[281415]: 2025-11-26 10:02:13.193 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:13 localhost kernel: device tap0da4c8be-18 entered promiscuous mode Nov 26 05:02:13 localhost NetworkManager[5970]: [1764151333.2058] manager: (tap0da4c8be-18): new Generic device (/org/freedesktop/NetworkManager/Devices/49) Nov 26 05:02:13 localhost ovn_controller[153664]: 2025-11-26T10:02:13Z|00297|binding|INFO|Claiming lport 0da4c8be-1871-4fdf-8ba5-062b50e238ab for this chassis. 
Nov 26 05:02:13 localhost nova_compute[281415]: 2025-11-26 10:02:13.207 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:13 localhost ovn_controller[153664]: 2025-11-26T10:02:13Z|00298|binding|INFO|0da4c8be-1871-4fdf-8ba5-062b50e238ab: Claiming unknown Nov 26 05:02:13 localhost systemd-udevd[318382]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:02:13 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:13.220 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-8b259add-0a8d-49d3-827c-570e822875fa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b259add-0a8d-49d3-827c-570e822875fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b441b9cc9474cf0bf826c2d3b0ac3a3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=027d1b16-cf7e-43f2-89a2-4093a0950f2f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0da4c8be-1871-4fdf-8ba5-062b50e238ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:13 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:13.222 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port 0da4c8be-1871-4fdf-8ba5-062b50e238ab in datapath 8b259add-0a8d-49d3-827c-570e822875fa bound to our chassis#033[00m Nov 26 05:02:13 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:13.224 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8b259add-0a8d-49d3-827c-570e822875fa or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:02:13 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:13.225 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1ed130-91df-424d-8533-4cbe716e2944]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:13 localhost journal[229445]: ethtool ioctl error on tap0da4c8be-18: No such device Nov 26 05:02:13 localhost ovn_controller[153664]: 2025-11-26T10:02:13Z|00299|binding|INFO|Setting lport 0da4c8be-1871-4fdf-8ba5-062b50e238ab ovn-installed in OVS Nov 26 05:02:13 localhost ovn_controller[153664]: 2025-11-26T10:02:13Z|00300|binding|INFO|Setting lport 0da4c8be-1871-4fdf-8ba5-062b50e238ab up in Southbound Nov 26 05:02:13 localhost nova_compute[281415]: 2025-11-26 10:02:13.247 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:13 localhost nova_compute[281415]: 2025-11-26 10:02:13.249 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:13 localhost journal[229445]: ethtool ioctl error on tap0da4c8be-18: No such device Nov 26 05:02:13 localhost journal[229445]: ethtool ioctl error on tap0da4c8be-18: No such device Nov 26 05:02:13 localhost podman[318370]: Nov 26 05:02:13 localhost journal[229445]: ethtool ioctl error on tap0da4c8be-18: No such device Nov 26 05:02:13 
localhost journal[229445]: ethtool ioctl error on tap0da4c8be-18: No such device Nov 26 05:02:13 localhost podman[318370]: 2025-11-26 10:02:13.270155662 +0000 UTC m=+0.100273437 container create 4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:02:13 localhost journal[229445]: ethtool ioctl error on tap0da4c8be-18: No such device Nov 26 05:02:13 localhost journal[229445]: ethtool ioctl error on tap0da4c8be-18: No such device Nov 26 05:02:13 localhost journal[229445]: ethtool ioctl error on tap0da4c8be-18: No such device Nov 26 05:02:13 localhost nova_compute[281415]: 2025-11-26 10:02:13.296 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:13 localhost podman[318370]: 2025-11-26 10:02:13.233876002 +0000 UTC m=+0.063993827 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:13 localhost systemd[1]: Started libpod-conmon-4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e.scope. Nov 26 05:02:13 localhost nova_compute[281415]: 2025-11-26 10:02:13.343 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:13 localhost systemd[1]: Started libcrun container. 
Nov 26 05:02:13 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e133 e133: 6 total, 6 up, 6 in Nov 26 05:02:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/318886c504d8ca75d2c617d9eaebb238407f9edf966142ea96759dd43f951d57/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:13 localhost podman[318370]: 2025-11-26 10:02:13.400054224 +0000 UTC m=+0.230171989 container init 4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:13 localhost podman[318370]: 2025-11-26 10:02:13.412593373 +0000 UTC m=+0.242711168 container start 4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:13 localhost dnsmasq[318418]: started, version 2.85 cachesize 150 Nov 26 05:02:13 localhost dnsmasq[318418]: DNS service limited to local subnets Nov 26 05:02:13 localhost dnsmasq[318418]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC 
loop-detect inotify dumpfile Nov 26 05:02:13 localhost dnsmasq[318418]: warning: no upstream servers configured Nov 26 05:02:13 localhost dnsmasq-dhcp[318418]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:02:13 localhost dnsmasq[318418]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:02:13 localhost dnsmasq-dhcp[318418]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:13 localhost dnsmasq-dhcp[318418]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:13 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:13.485 262471 INFO neutron.agent.dhcp.agent [None req-7479064e-0fdd-4590-847e-c80e53bbd98f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:11Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=505963de-66f4-43f7-ac51-b66afc572564, ip_allocation=immediate, mac_address=fa:16:3e:0f:2f:f3, name=tempest-NetworksTestDHCPv6-684352686, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['3b2c7cdd-1751-4b3a-bfbc-d944362f489f', '4170c344-72ea-4f95-8011-441bf5c09ffe'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, 
updated_at=2025-11-26T10:02:09Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1735, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:02:12Z on network cc3dc995-51cd-4d70-be2c-11c47524552d#033[00m Nov 26 05:02:13 localhost systemd[1]: tmp-crun.JUH5ZG.mount: Deactivated successfully. Nov 26 05:02:13 localhost dnsmasq[318418]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 2 addresses Nov 26 05:02:13 localhost podman[318443]: 2025-11-26 10:02:13.696640951 +0000 UTC m=+0.066588445 container kill 4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:02:13 localhost dnsmasq-dhcp[318418]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:13 localhost dnsmasq-dhcp[318418]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:13 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:13.880 262471 INFO neutron.agent.dhcp.agent [None req-a31d0630-910e-4c55-bc30-d06706bdd94f - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'd40c88ee-08b8-4e01-8b40-df66e45034fe'} is completed#033[00m Nov 26 05:02:13 localhost neutron_sriov_agent[255515]: 2025-11-26 
10:02:13.980 2 INFO neutron.agent.securitygroups_rpc [None req-56b61619-bab0-4759-ace1-a1ca400757a4 157ef898dc2545e0ba7d630031ec195f cbcf70cdf0f648dcb4f36e8ff029f9bd - - default default] Security group member updated ['0638b2b3-b725-4c44-9ee4-222e5c824dd6']#033[00m Nov 26 05:02:14 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:14.004 262471 INFO neutron.agent.dhcp.agent [None req-ad69f55a-f503-4e97-8919-c98fcc2ec8c8 - - - - - -] DHCP configuration for ports {'505963de-66f4-43f7-ac51-b66afc572564'} is completed#033[00m Nov 26 05:02:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:14 localhost dnsmasq[318418]: exiting on receipt of SIGTERM Nov 26 05:02:14 localhost systemd[1]: libpod-4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e.scope: Deactivated successfully. Nov 26 05:02:14 localhost podman[318495]: 2025-11-26 10:02:14.260727306 +0000 UTC m=+0.068060908 container kill 4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 05:02:14 localhost podman[318525]: 2025-11-26 10:02:14.338434158 +0000 UTC m=+0.066844082 container died 4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:02:14 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:14.424 262471 INFO neutron.agent.linux.ip_lib [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Device tap372762f4-47 cannot be used as it has no MAC address#033[00m Nov 26 05:02:14 localhost podman[318525]: 2025-11-26 10:02:14.430364129 +0000 UTC m=+0.158774013 container cleanup 4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:02:14 localhost systemd[1]: libpod-conmon-4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e.scope: Deactivated successfully. Nov 26 05:02:14 localhost nova_compute[281415]: 2025-11-26 10:02:14.451 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:14 localhost kernel: device tap372762f4-47 entered promiscuous mode Nov 26 05:02:14 localhost ovn_controller[153664]: 2025-11-26T10:02:14Z|00301|binding|INFO|Claiming lport 372762f4-4715-48ac-9a46-545c83712abb for this chassis. 
Nov 26 05:02:14 localhost ovn_controller[153664]: 2025-11-26T10:02:14Z|00302|binding|INFO|372762f4-4715-48ac-9a46-545c83712abb: Claiming unknown Nov 26 05:02:14 localhost systemd-udevd[318387]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:02:14 localhost NetworkManager[5970]: [1764151334.4580] manager: (tap372762f4-47): new Generic device (/org/freedesktop/NetworkManager/Devices/50) Nov 26 05:02:14 localhost nova_compute[281415]: 2025-11-26 10:02:14.460 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:14 localhost podman[318532]: 2025-11-26 10:02:14.46565182 +0000 UTC m=+0.171205100 container remove 4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:02:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:14.469 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d9ce53f5-f65d-49e5-85b5-b057afd2da2a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-d9ce53f5-f65d-49e5-85b5-b057afd2da2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbcf70cdf0f648dcb4f36e8ff029f9bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c48b4f-6a89-46d3-a552-11157569bb42, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=372762f4-4715-48ac-9a46-545c83712abb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:14 localhost podman[318556]: Nov 26 05:02:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:14.471 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 372762f4-4715-48ac-9a46-545c83712abb in datapath d9ce53f5-f65d-49e5-85b5-b057afd2da2a bound to our chassis#033[00m Nov 26 05:02:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:14.472 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d9ce53f5-f65d-49e5-85b5-b057afd2da2a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:02:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:14.473 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[54f7c464-35b2-4f57-9270-89b43007f1f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:14 localhost nova_compute[281415]: 2025-11-26 10:02:14.488 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:14 localhost ovn_controller[153664]: 2025-11-26T10:02:14Z|00303|binding|INFO|Setting lport 
372762f4-4715-48ac-9a46-545c83712abb ovn-installed in OVS Nov 26 05:02:14 localhost ovn_controller[153664]: 2025-11-26T10:02:14Z|00304|binding|INFO|Setting lport 372762f4-4715-48ac-9a46-545c83712abb up in Southbound Nov 26 05:02:14 localhost podman[318556]: 2025-11-26 10:02:14.490000818 +0000 UTC m=+0.138594279 container create e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b259add-0a8d-49d3-827c-570e822875fa, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:14 localhost nova_compute[281415]: 2025-11-26 10:02:14.489 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:14 localhost systemd[1]: tmp-crun.NzFPoo.mount: Deactivated successfully. Nov 26 05:02:14 localhost systemd[1]: var-lib-containers-storage-overlay-318886c504d8ca75d2c617d9eaebb238407f9edf966142ea96759dd43f951d57-merged.mount: Deactivated successfully. Nov 26 05:02:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4becc0078134ccb261801182b87aab1b10aec41da150fed89129d651c0ab080e-userdata-shm.mount: Deactivated successfully. 
Nov 26 05:02:14 localhost nova_compute[281415]: 2025-11-26 10:02:14.500 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:14 localhost podman[318556]: 2025-11-26 10:02:14.410610026 +0000 UTC m=+0.059203527 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:14 localhost systemd[1]: Started libpod-conmon-e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560.scope. Nov 26 05:02:14 localhost systemd[1]: Started libcrun container. Nov 26 05:02:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a02c371f87d27f3c85733ea0f63759d8c40f385e9712b4b79a5dc884e8b8ec6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:14 localhost nova_compute[281415]: 2025-11-26 10:02:14.564 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:14 localhost podman[318556]: 2025-11-26 10:02:14.567332758 +0000 UTC m=+0.215926209 container init e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b259add-0a8d-49d3-827c-570e822875fa, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:14 localhost podman[318556]: 2025-11-26 10:02:14.573534862 +0000 UTC m=+0.222128343 container start e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-8b259add-0a8d-49d3-827c-570e822875fa, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:14 localhost dnsmasq[318597]: started, version 2.85 cachesize 150 Nov 26 05:02:14 localhost dnsmasq[318597]: DNS service limited to local subnets Nov 26 05:02:14 localhost dnsmasq[318597]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:14 localhost dnsmasq[318597]: warning: no upstream servers configured Nov 26 05:02:14 localhost dnsmasq-dhcp[318597]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Nov 26 05:02:14 localhost dnsmasq[318597]: read /var/lib/neutron/dhcp/8b259add-0a8d-49d3-827c-570e822875fa/addn_hosts - 0 addresses Nov 26 05:02:14 localhost dnsmasq-dhcp[318597]: read /var/lib/neutron/dhcp/8b259add-0a8d-49d3-827c-570e822875fa/host Nov 26 05:02:14 localhost dnsmasq-dhcp[318597]: read /var/lib/neutron/dhcp/8b259add-0a8d-49d3-827c-570e822875fa/opts Nov 26 05:02:14 localhost nova_compute[281415]: 2025-11-26 10:02:14.603 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:14 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:14.763 262471 INFO neutron.agent.dhcp.agent [None req-a0fff8ee-8802-455e-92a3-08eddbbb72ba - - - - - -] DHCP configuration for ports {'09172db5-0e42-4e77-ab76-52263168b2cd'} is completed#033[00m Nov 26 05:02:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:15.030 2 INFO neutron.agent.securitygroups_rpc [None req-50efe965-25c4-4039-9ea4-7a24a06ce0ec 157ef898dc2545e0ba7d630031ec195f 
cbcf70cdf0f648dcb4f36e8ff029f9bd - - default default] Security group member updated ['0638b2b3-b725-4c44-9ee4-222e5c824dd6']#033[00m Nov 26 05:02:15 localhost podman[318670]: Nov 26 05:02:15 localhost podman[318670]: 2025-11-26 10:02:15.445511288 +0000 UTC m=+0.095265381 container create 12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 05:02:15 localhost systemd[1]: Started libpod-conmon-12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe.scope. Nov 26 05:02:15 localhost podman[318670]: 2025-11-26 10:02:15.399678866 +0000 UTC m=+0.049433019 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:15 localhost systemd[1]: Started libcrun container. 
Nov 26 05:02:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e8aa47184040e36a11ea9ef2a64e51eb178994f290737fe202e20338e130766/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:15 localhost podman[318670]: 2025-11-26 10:02:15.532538554 +0000 UTC m=+0.182292637 container init 12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:15 localhost systemd[1]: tmp-crun.mVw7bZ.mount: Deactivated successfully. Nov 26 05:02:15 localhost podman[318670]: 2025-11-26 10:02:15.543507938 +0000 UTC m=+0.193262021 container start 12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:02:15 localhost dnsmasq[318720]: started, version 2.85 cachesize 150 Nov 26 05:02:15 localhost dnsmasq[318720]: DNS service limited to local subnets Nov 26 05:02:15 localhost dnsmasq[318720]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify 
dumpfile Nov 26 05:02:15 localhost dnsmasq[318720]: warning: no upstream servers configured Nov 26 05:02:15 localhost dnsmasq-dhcp[318720]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:02:15 localhost dnsmasq[318720]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:02:15 localhost dnsmasq-dhcp[318720]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:15 localhost dnsmasq-dhcp[318720]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:15 localhost podman[318702]: Nov 26 05:02:15 localhost podman[318702]: 2025-11-26 10:02:15.579112128 +0000 UTC m=+0.117211198 container create b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 26 05:02:15 localhost podman[318702]: 2025-11-26 10:02:15.513095991 +0000 UTC m=+0.051195131 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:15 localhost systemd[1]: Started libpod-conmon-b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22.scope. Nov 26 05:02:15 localhost systemd[1]: Started libcrun container. 
Nov 26 05:02:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278472759d418b83ff3f4b17cff9e0876edeec5ad27579e5c9b2bd923ebb0998/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:15 localhost podman[318702]: 2025-11-26 10:02:15.650857344 +0000 UTC m=+0.188956414 container init b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:02:15 localhost podman[318702]: 2025-11-26 10:02:15.659405506 +0000 UTC m=+0.197504576 container start b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:02:15 localhost dnsmasq[318726]: started, version 2.85 cachesize 150 Nov 26 05:02:15 localhost dnsmasq[318726]: DNS service limited to local subnets Nov 26 05:02:15 localhost dnsmasq[318726]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:15 localhost dnsmasq[318726]: warning: no upstream servers 
configured Nov 26 05:02:15 localhost dnsmasq-dhcp[318726]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:02:15 localhost dnsmasq[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/addn_hosts - 0 addresses Nov 26 05:02:15 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/host Nov 26 05:02:15 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/opts Nov 26 05:02:15 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:15.721 262471 INFO neutron.agent.dhcp.agent [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:13Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=50e638da-c733-43a2-a1d3-358e5447faa1, ip_allocation=immediate, mac_address=fa:16:3e:63:56:7e, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1749948912, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:11Z, description=, dns_domain=, id=d9ce53f5-f65d-49e5-85b5-b057afd2da2a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-646149131, port_security_enabled=True, project_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38511, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1732, status=ACTIVE, subnets=['da88be11-9c89-45b1-808e-8eb824e587c6'], tags=[], tenant_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, updated_at=2025-11-26T10:02:13Z, vlan_transparent=None, network_id=d9ce53f5-f65d-49e5-85b5-b057afd2da2a, port_security_enabled=True, 
project_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['0638b2b3-b725-4c44-9ee4-222e5c824dd6'], standard_attr_id=1756, status=DOWN, tags=[], tenant_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, updated_at=2025-11-26T10:02:13Z on network d9ce53f5-f65d-49e5-85b5-b057afd2da2a#033[00m Nov 26 05:02:15 localhost openstack_network_exporter[242153]: ERROR 10:02:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:02:15 localhost openstack_network_exporter[242153]: ERROR 10:02:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:02:15 localhost openstack_network_exporter[242153]: ERROR 10:02:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:02:15 localhost openstack_network_exporter[242153]: ERROR 10:02:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:02:15 localhost openstack_network_exporter[242153]: Nov 26 05:02:15 localhost openstack_network_exporter[242153]: ERROR 10:02:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:02:15 localhost openstack_network_exporter[242153]: Nov 26 05:02:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:15.789 2 INFO neutron.agent.securitygroups_rpc [None req-67c5da47-a025-4f8b-999e-119bfe771208 157ef898dc2545e0ba7d630031ec195f cbcf70cdf0f648dcb4f36e8ff029f9bd - - default default] Security group member updated ['0638b2b3-b725-4c44-9ee4-222e5c824dd6']#033[00m Nov 26 05:02:15 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:15.870 262471 INFO neutron.agent.dhcp.agent [None req-4cae81d2-e86d-4a50-885d-fdab07f94cf2 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'd40c88ee-08b8-4e01-8b40-df66e45034fe'} is completed#033[00m Nov 26 05:02:15 
localhost dnsmasq[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/addn_hosts - 1 addresses Nov 26 05:02:15 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/host Nov 26 05:02:15 localhost podman[318755]: 2025-11-26 10:02:15.945682728 +0000 UTC m=+0.071052786 container kill b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:02:15 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/opts Nov 26 05:02:16 localhost dnsmasq[318720]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses Nov 26 05:02:16 localhost dnsmasq-dhcp[318720]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host Nov 26 05:02:16 localhost dnsmasq-dhcp[318720]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts Nov 26 05:02:16 localhost podman[318777]: 2025-11-26 10:02:16.017828826 +0000 UTC m=+0.072372665 container kill 12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:02:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:16.115 262471 INFO neutron.agent.dhcp.agent [None req-6bea91a6-207c-4e30-a634-bd6a67c6a506 - - - - - -] DHCP configuration for ports {'5853b90e-61eb-4212-baa5-14739c8eaced'} is completed#033[00m Nov 26 05:02:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:16.138 262471 INFO neutron.agent.dhcp.agent [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:14Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=bf4314d0-3bda-4d11-82fb-950988cadb53, ip_allocation=immediate, mac_address=fa:16:3e:2d:7e:19, name=tempest-ExtraDHCPOptionsIpV6TestJSON-54734991, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:11Z, description=, dns_domain=, id=d9ce53f5-f65d-49e5-85b5-b057afd2da2a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-646149131, port_security_enabled=True, project_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38511, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1732, status=ACTIVE, subnets=['da88be11-9c89-45b1-808e-8eb824e587c6'], tags=[], tenant_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, updated_at=2025-11-26T10:02:13Z, vlan_transparent=None, network_id=d9ce53f5-f65d-49e5-85b5-b057afd2da2a, port_security_enabled=True, project_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, 
security_groups=['0638b2b3-b725-4c44-9ee4-222e5c824dd6'], standard_attr_id=1761, status=DOWN, tags=[], tenant_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, updated_at=2025-11-26T10:02:14Z on network d9ce53f5-f65d-49e5-85b5-b057afd2da2a#033[00m Nov 26 05:02:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:16.158 262471 INFO neutron.agent.linux.dhcp [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Nov 26 05:02:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:16.159 262471 INFO neutron.agent.linux.dhcp [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Nov 26 05:02:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:16.159 262471 INFO neutron.agent.linux.dhcp [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Nov 26 05:02:16 localhost nova_compute[281415]: 2025-11-26 10:02:16.288 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:16.308 262471 INFO neutron.agent.dhcp.agent [None req-eb0b6098-da81-4a9a-aa77-c88d5d121904 - - - - - -] DHCP configuration for ports {'50e638da-c733-43a2-a1d3-358e5447faa1'} is completed#033[00m Nov 26 05:02:16 localhost dnsmasq[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/addn_hosts - 2 addresses Nov 26 05:02:16 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/host Nov 26 05:02:16 localhost podman[318826]: 2025-11-26 10:02:16.373331401 +0000 UTC m=+0.071683566 container kill b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 05:02:16 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/opts Nov 26 05:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:02:16 localhost systemd[1]: tmp-crun.UzXe7d.mount: Deactivated successfully. Nov 26 05:02:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:16.593 262471 INFO neutron.agent.dhcp.agent [None req-c3ee24e5-d9ad-4826-878a-39af8877424a - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'd40c88ee-08b8-4e01-8b40-df66e45034fe'} is completed#033[00m Nov 26 05:02:16 localhost podman[318845]: 2025-11-26 10:02:16.594574866 +0000 UTC m=+0.098884307 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Nov 26 05:02:16 localhost podman[318845]: 2025-11-26 10:02:16.629442404 +0000 UTC m=+0.133751825 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:02:16 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:02:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:16.650 2 INFO neutron.agent.securitygroups_rpc [None req-b01553bc-7fc7-440f-8fc8-1be00d4b9c8c b3c44a0e883d4a21bd13a1fdbfec53c1 1b441b9cc9474cf0bf826c2d3b0ac3a3 - - default default] Security group member updated ['3c90db89-734e-43e3-a179-44c4998e953c']#033[00m Nov 26 05:02:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:16.708 262471 INFO neutron.agent.dhcp.agent [None req-d7b484dd-4659-465f-9246-d08c371c0db1 - - - - - -] DHCP configuration for ports {'bf4314d0-3bda-4d11-82fb-950988cadb53'} is completed#033[00m Nov 26 05:02:16 localhost podman[318846]: 2025-11-26 10:02:16.692140173 +0000 UTC m=+0.191850039 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 05:02:16 localhost podman[318846]: 2025-11-26 10:02:16.771872814 +0000 UTC m=+0.271582710 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:02:16 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:02:16 localhost dnsmasq[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/addn_hosts - 1 addresses Nov 26 05:02:16 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/host Nov 26 05:02:16 localhost podman[318903]: 2025-11-26 10:02:16.845564108 +0000 UTC m=+0.120103343 container kill b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:02:16 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/opts Nov 26 05:02:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:17.046 262471 INFO neutron.agent.dhcp.agent [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:13Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, 
extra_dhcp_opts=[, , ], fixed_ips=[], id=50e638da-c733-43a2-a1d3-358e5447faa1, ip_allocation=immediate, mac_address=fa:16:3e:63:56:7e, name=tempest-new-port-name-1193191482, network_id=d9ce53f5-f65d-49e5-85b5-b057afd2da2a, port_security_enabled=True, project_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['0638b2b3-b725-4c44-9ee4-222e5c824dd6'], standard_attr_id=1756, status=DOWN, tags=[], tenant_id=cbcf70cdf0f648dcb4f36e8ff029f9bd, updated_at=2025-11-26T10:02:16Z on network d9ce53f5-f65d-49e5-85b5-b057afd2da2a#033[00m Nov 26 05:02:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:17.069 262471 INFO neutron.agent.linux.dhcp [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Nov 26 05:02:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:17.070 262471 INFO neutron.agent.linux.dhcp [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Nov 26 05:02:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:17.071 262471 INFO neutron.agent.linux.dhcp [None req-6ab8569d-a427-4996-808e-bffc9ca4246e - - - - - -] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Nov 26 05:02:17 localhost dnsmasq[318720]: exiting on receipt of SIGTERM Nov 26 05:02:17 localhost podman[318941]: 2025-11-26 10:02:17.112997345 +0000 UTC m=+0.065230595 container kill 12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:02:17 localhost systemd[1]: libpod-12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe.scope: Deactivated successfully. Nov 26 05:02:17 localhost podman[318964]: 2025-11-26 10:02:17.197685302 +0000 UTC m=+0.067777259 container died 12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:02:17 localhost podman[318964]: 2025-11-26 10:02:17.230397817 +0000 UTC m=+0.100489744 container cleanup 12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 05:02:17 localhost systemd[1]: libpod-conmon-12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe.scope: Deactivated successfully. 
Nov 26 05:02:17 localhost podman[318968]: 2025-11-26 10:02:17.271091087 +0000 UTC m=+0.130937302 container remove 12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:17 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:17.282 2 INFO neutron.agent.securitygroups_rpc [None req-64b35d52-4482-4613-b414-d050e32bc932 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['8aac6c4a-7436-4eac-ae13-d5f6aa807fff']#033[00m Nov 26 05:02:17 localhost dnsmasq[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/addn_hosts - 1 addresses Nov 26 05:02:17 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/host Nov 26 05:02:17 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/opts Nov 26 05:02:17 localhost podman[318991]: 2025-11-26 10:02:17.321834014 +0000 UTC m=+0.116016073 container kill b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0) Nov 26 05:02:17 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:17.481 2 INFO neutron.agent.securitygroups_rpc [None req-9c002755-9590-47a8-ae6d-6e189ba014e8 157ef898dc2545e0ba7d630031ec195f cbcf70cdf0f648dcb4f36e8ff029f9bd - - default default] Security group member updated ['0638b2b3-b725-4c44-9ee4-222e5c824dd6']#033[00m Nov 26 05:02:17 localhost systemd[1]: var-lib-containers-storage-overlay-6e8aa47184040e36a11ea9ef2a64e51eb178994f290737fe202e20338e130766-merged.mount: Deactivated successfully. Nov 26 05:02:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12af64cd3c7b1d59b3fbe9b724ebf549079139bb92ccc3856694fb9d689bdcbe-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:17 localhost nova_compute[281415]: 2025-11-26 10:02:17.570 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:17.585 262471 INFO neutron.agent.dhcp.agent [None req-bac533b0-c734-4772-bf6c-27cf3fecce1c - - - - - -] DHCP configuration for ports {'50e638da-c733-43a2-a1d3-358e5447faa1'} is completed#033[00m Nov 26 05:02:17 localhost dnsmasq[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/addn_hosts - 0 addresses Nov 26 05:02:17 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/host Nov 26 05:02:17 localhost dnsmasq-dhcp[318726]: read /var/lib/neutron/dhcp/d9ce53f5-f65d-49e5-85b5-b057afd2da2a/opts Nov 26 05:02:17 localhost podman[319035]: 2025-11-26 10:02:17.823690265 +0000 UTC m=+0.071979044 container kill b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:02:18 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:18.171 2 INFO neutron.agent.securitygroups_rpc [None req-dfbda559-8485-4335-8024-1eec4b5bab40 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']#033[00m Nov 26 05:02:18 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e134 e134: 6 total, 6 up, 6 in Nov 26 05:02:18 localhost dnsmasq[318726]: exiting on receipt of SIGTERM Nov 26 05:02:18 localhost podman[319117]: 2025-11-26 10:02:18.639843114 +0000 UTC m=+0.062297208 container kill b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:18 localhost systemd[1]: libpod-b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22.scope: Deactivated successfully. 
Nov 26 05:02:18 localhost podman[319133]: Nov 26 05:02:18 localhost podman[319145]: 2025-11-26 10:02:18.721394649 +0000 UTC m=+0.067491311 container died b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:02:18 localhost systemd[1]: tmp-crun.CcCmu2.mount: Deactivated successfully. Nov 26 05:02:18 localhost podman[319145]: 2025-11-26 10:02:18.76076396 +0000 UTC m=+0.106860602 container cleanup b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:02:18 localhost systemd[1]: libpod-conmon-b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22.scope: Deactivated successfully. 
Nov 26 05:02:18 localhost podman[319133]: 2025-11-26 10:02:18.671273411 +0000 UTC m=+0.058472686 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:02:18 localhost podman[319154]: 2025-11-26 10:02:18.79737407 +0000 UTC m=+0.122388960 container remove b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ce53f5-f65d-49e5-85b5-b057afd2da2a, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 26 05:02:18 localhost podman[319133]: 2025-11-26 10:02:18.829665723 +0000 UTC m=+0.216864958 container create 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 26 05:02:18 localhost ovn_controller[153664]: 2025-11-26T10:02:18Z|00305|binding|INFO|Releasing lport 372762f4-4715-48ac-9a46-545c83712abb from this chassis (sb_readonly=0)
Nov 26 05:02:18 localhost ovn_controller[153664]: 2025-11-26T10:02:18Z|00306|binding|INFO|Setting lport 372762f4-4715-48ac-9a46-545c83712abb down in Southbound
Nov 26 05:02:18 localhost nova_compute[281415]: 2025-11-26 10:02:18.851 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:18 localhost kernel: device tap372762f4-47 left promiscuous mode
Nov 26 05:02:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:18.860 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d9ce53f5-f65d-49e5-85b5-b057afd2da2a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9ce53f5-f65d-49e5-85b5-b057afd2da2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cbcf70cdf0f648dcb4f36e8ff029f9bd', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14c48b4f-6a89-46d3-a552-11157569bb42, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=372762f4-4715-48ac-9a46-545c83712abb) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 26 05:02:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:18.863 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 372762f4-4715-48ac-9a46-545c83712abb in datapath d9ce53f5-f65d-49e5-85b5-b057afd2da2a unbound from our chassis
Nov 26 05:02:18 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:18.864 2 INFO neutron.agent.securitygroups_rpc [None req-8edcee90-11e6-463e-96b1-ee6e3413aad3 02dda53213c14fc6b2416359dddab4ae fbf16d8f1271436498d8d9cbfb24239d - - default default] Security group member updated ['513251a1-00ec-4f61-b1d4-b1337479c848']
Nov 26 05:02:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:18.864 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d9ce53f5-f65d-49e5-85b5-b057afd2da2a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 26 05:02:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:18.865 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[0fcae214-3e34-4564-9b4b-dbcc8f64d096]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 05:02:18 localhost nova_compute[281415]: 2025-11-26 10:02:18.872 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:18 localhost systemd[1]: Started libpod-conmon-4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2.scope.
Nov 26 05:02:18 localhost systemd[1]: Started libcrun container.
Nov 26 05:02:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b36d24d76fd2eb32056164b7f578e49804db557cdc66488507519e22e2a7238d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:02:18 localhost podman[319133]: 2025-11-26 10:02:18.913563077 +0000 UTC m=+0.300762312 container init 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:02:18 localhost podman[319133]: 2025-11-26 10:02:18.919127171 +0000 UTC m=+0.306326416 container start 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 26 05:02:18 localhost dnsmasq[319182]: started, version 2.85 cachesize 150
Nov 26 05:02:18 localhost dnsmasq[319182]: DNS service limited to local subnets
Nov 26 05:02:18 localhost dnsmasq[319182]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:02:18 localhost dnsmasq[319182]: warning: no upstream servers configured
Nov 26 05:02:18 localhost dnsmasq-dhcp[319182]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 26 05:02:18 localhost dnsmasq-dhcp[319182]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 26 05:02:18 localhost dnsmasq[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:02:18 localhost dnsmasq-dhcp[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host
Nov 26 05:02:18 localhost dnsmasq-dhcp[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts
Nov 26 05:02:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:18.984 262471 INFO neutron.agent.dhcp.agent [None req-2f7c4cb8-e92d-455a-b0d5-0b07de331257 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:17Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=89ac684d-9b5e-4758-b02d-2e1ce06c25d3, ip_allocation=immediate, mac_address=fa:16:3e:a4:17:0d, name=tempest-NetworksTestDHCPv6-811734743, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:00:20Z, description=, dns_domain=, id=cc3dc995-51cd-4d70-be2c-11c47524552d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-843096697, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42907, qos_policy_id=None, revision_number=61, router:external=False, shared=False, standard_attr_id=1055, status=ACTIVE, subnets=['5980931f-7e56-4c29-a765-c67e6a338ad4', '9a4e4407-d930-46d6-b4b5-1b8015fda51d'], tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:02:15Z, vlan_transparent=None, network_id=cc3dc995-51cd-4d70-be2c-11c47524552d, port_security_enabled=True, project_id=fbf16d8f1271436498d8d9cbfb24239d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['513251a1-00ec-4f61-b1d4-b1337479c848'], standard_attr_id=1795, status=DOWN, tags=[], tenant_id=fbf16d8f1271436498d8d9cbfb24239d, updated_at=2025-11-26T10:02:17Z on network cc3dc995-51cd-4d70-be2c-11c47524552d
Nov 26 05:02:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:19.140 262471 INFO neutron.agent.dhcp.agent [None req-87353a38-958b-42dd-b54f-455d51974a38 - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'd40c88ee-08b8-4e01-8b40-df66e45034fe'} is completed
Nov 26 05:02:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:19.146 262471 INFO neutron.agent.dhcp.agent [None req-0abd0ff2-3a33-4c84-b988-9779da92a8b0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 26 05:02:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 05:02:19 localhost dnsmasq[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 2 addresses
Nov 26 05:02:19 localhost dnsmasq-dhcp[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host
Nov 26 05:02:19 localhost dnsmasq-dhcp[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts
Nov 26 05:02:19 localhost podman[319202]: 2025-11-26 10:02:19.191624797 +0000 UTC m=+0.062274827 container kill 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 26 05:02:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:19.347 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 26 05:02:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:19.455 262471 INFO neutron.agent.dhcp.agent [None req-f700edc0-0621-48b9-a663-4d8dcda05145 - - - - - -] DHCP configuration for ports {'89ac684d-9b5e-4758-b02d-2e1ce06c25d3'} is completed
Nov 26 05:02:19 localhost dnsmasq[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:02:19 localhost dnsmasq-dhcp[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host
Nov 26 05:02:19 localhost dnsmasq-dhcp[319182]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts
Nov 26 05:02:19 localhost podman[319240]: 2025-11-26 10:02:19.561466734 +0000 UTC m=+0.066282766 container kill 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 26 05:02:19 localhost systemd[1]: var-lib-containers-storage-overlay-278472759d418b83ff3f4b17cff9e0876edeec5ad27579e5c9b2bd923ebb0998-merged.mount: Deactivated successfully.
Nov 26 05:02:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3c2810b4f97904b9d5d4e39ac58909a4664a08c57d577afd7ac1758d8fd4c22-userdata-shm.mount: Deactivated successfully.
Nov 26 05:02:19 localhost systemd[1]: run-netns-qdhcp\x2dd9ce53f5\x2df65d\x2d49e5\x2d85b5\x2db057afd2da2a.mount: Deactivated successfully.
Nov 26 05:02:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:20.016 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 26 05:02:20 localhost podman[319279]: 2025-11-26 10:02:20.358671686 +0000 UTC m=+0.050661135 container kill 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 26 05:02:20 localhost dnsmasq[319182]: exiting on receipt of SIGTERM
Nov 26 05:02:20 localhost systemd[1]: libpod-4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2.scope: Deactivated successfully.
Nov 26 05:02:20 localhost podman[319293]: 2025-11-26 10:02:20.416902242 +0000 UTC m=+0.041659138 container died 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 26 05:02:20 localhost podman[319293]: 2025-11-26 10:02:20.451585926 +0000 UTC m=+0.076342772 container cleanup 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:02:20 localhost systemd[1]: libpod-conmon-4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2.scope: Deactivated successfully.
Nov 26 05:02:20 localhost podman[319294]: 2025-11-26 10:02:20.510449582 +0000 UTC m=+0.125098421 container remove 4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:02:20 localhost systemd[1]: var-lib-containers-storage-overlay-b36d24d76fd2eb32056164b7f578e49804db557cdc66488507519e22e2a7238d-merged.mount: Deactivated successfully.
Nov 26 05:02:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4302ff1727451162f146fefcdade3802373548a77bb9bc9d8bc9abc1de5a46c2-userdata-shm.mount: Deactivated successfully.
Nov 26 05:02:20 localhost ovn_controller[153664]: 2025-11-26T10:02:20Z|00307|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 05:02:20 localhost nova_compute[281415]: 2025-11-26 10:02:20.756 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:20 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:20.921 2 INFO neutron.agent.securitygroups_rpc [None req-1a1a743b-5c9a-4d47-8abb-a399b61331d6 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['a1ed0076-7d1f-49d0-ba76-2415daa9edc4', '8aac6c4a-7436-4eac-ae13-d5f6aa807fff', '55207669-6f58-43ae-a7bb-0ddcdb47a419']
Nov 26 05:02:21 localhost nova_compute[281415]: 2025-11-26 10:02:21.336 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:21 localhost podman[319370]:
Nov 26 05:02:21 localhost podman[319370]: 2025-11-26 10:02:21.524110087 +0000 UTC m=+0.097995041 container create 3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 26 05:02:21 localhost systemd[1]: Started libpod-conmon-3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695.scope.
Nov 26 05:02:21 localhost podman[319370]: 2025-11-26 10:02:21.479156471 +0000 UTC m=+0.053041465 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:02:21 localhost systemd[1]: Started libcrun container.
Nov 26 05:02:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51bafa993ea74d573451621019db16ed0bbc40a3c6f5962d388b76b266247142/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:02:21 localhost podman[319370]: 2025-11-26 10:02:21.612791111 +0000 UTC m=+0.186676065 container init 3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 26 05:02:21 localhost podman[319370]: 2025-11-26 10:02:21.622702674 +0000 UTC m=+0.196587628 container start 3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 26 05:02:21 localhost dnsmasq[319388]: started, version 2.85 cachesize 150
Nov 26 05:02:21 localhost dnsmasq[319388]: DNS service limited to local subnets
Nov 26 05:02:21 localhost dnsmasq[319388]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:02:21 localhost dnsmasq[319388]: warning: no upstream servers configured
Nov 26 05:02:21 localhost dnsmasq-dhcp[319388]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 26 05:02:21 localhost dnsmasq[319388]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/addn_hosts - 0 addresses
Nov 26 05:02:21 localhost dnsmasq-dhcp[319388]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/host
Nov 26 05:02:21 localhost dnsmasq-dhcp[319388]: read /var/lib/neutron/dhcp/cc3dc995-51cd-4d70-be2c-11c47524552d/opts
Nov 26 05:02:21 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:21.815 2 INFO neutron.agent.securitygroups_rpc [None req-8699e15e-b75a-4972-aeba-ea5d6075e64b a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['a1ed0076-7d1f-49d0-ba76-2415daa9edc4', '55207669-6f58-43ae-a7bb-0ddcdb47a419']
Nov 26 05:02:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:21.905 262471 INFO neutron.agent.dhcp.agent [None req-9b61b560-6c60-4c31-8f59-f4f222345fbe - - - - - -] DHCP configuration for ports {'ba010266-c829-4775-9f81-9e5e8ac0a898', 'd40c88ee-08b8-4e01-8b40-df66e45034fe'} is completed
Nov 26 05:02:21 localhost dnsmasq[319388]: exiting on receipt of SIGTERM
Nov 26 05:02:21 localhost podman[319406]: 2025-11-26 10:02:21.916819108 +0000 UTC m=+0.050267143 container kill 3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 05:02:21 localhost systemd[1]: tmp-crun.iHBuU8.mount: Deactivated successfully.
Nov 26 05:02:21 localhost systemd[1]: libpod-3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695.scope: Deactivated successfully.
Nov 26 05:02:21 localhost podman[319420]: 2025-11-26 10:02:21.982648139 +0000 UTC m=+0.045263806 container died 3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 26 05:02:22 localhost podman[319420]: 2025-11-26 10:02:22.082532595 +0000 UTC m=+0.145148232 container remove 3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cc3dc995-51cd-4d70-be2c-11c47524552d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 26 05:02:22 localhost systemd[1]: libpod-conmon-3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695.scope: Deactivated successfully.
Nov 26 05:02:22 localhost ovn_controller[153664]: 2025-11-26T10:02:22Z|00308|binding|INFO|Releasing lport d40c88ee-08b8-4e01-8b40-df66e45034fe from this chassis (sb_readonly=0)
Nov 26 05:02:22 localhost nova_compute[281415]: 2025-11-26 10:02:22.100 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:22 localhost kernel: device tapd40c88ee-08 left promiscuous mode
Nov 26 05:02:22 localhost ovn_controller[153664]: 2025-11-26T10:02:22Z|00309|binding|INFO|Setting lport d40c88ee-08b8-4e01-8b40-df66e45034fe down in Southbound
Nov 26 05:02:22 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:22.109 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe50:ae77/64 2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc3dc995-51cd-4d70-be2c-11c47524552d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fbf16d8f1271436498d8d9cbfb24239d', 'neutron:revision_number': '12', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1caee94c-7e68-4f07-b319-0ae9a5582637, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d40c88ee-08b8-4e01-8b40-df66e45034fe) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 26 05:02:22 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:22.111 159486 INFO neutron.agent.ovn.metadata.agent [-] Port d40c88ee-08b8-4e01-8b40-df66e45034fe in datapath cc3dc995-51cd-4d70-be2c-11c47524552d unbound from our chassis
Nov 26 05:02:22 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:22.114 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cc3dc995-51cd-4d70-be2c-11c47524552d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 26 05:02:22 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:22.115 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f839a0-5b4b-4550-9403-af0379af102f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 05:02:22 localhost nova_compute[281415]: 2025-11-26 10:02:22.126 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:22 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:22.203 2 INFO neutron.agent.securitygroups_rpc [None req-35f4f6a7-5668-47fe-adb9-1545f5a20b39 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']
Nov 26 05:02:22 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:22.368 262471 INFO neutron.agent.dhcp.agent [None req-91f6d6a4-23cc-44b6-bbec-29340b3a481e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 26 05:02:22 localhost nova_compute[281415]: 2025-11-26 10:02:22.605 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:22 localhost systemd[1]: var-lib-containers-storage-overlay-51bafa993ea74d573451621019db16ed0bbc40a3c6f5962d388b76b266247142-merged.mount: Deactivated successfully.
Nov 26 05:02:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a38f25bbda49cdfce01c64fd7285c3a07b153dcee4be63554160049fab1d695-userdata-shm.mount: Deactivated successfully.
Nov 26 05:02:22 localhost systemd[1]: run-netns-qdhcp\x2dcc3dc995\x2d51cd\x2d4d70\x2dbe2c\x2d11c47524552d.mount: Deactivated successfully.
Nov 26 05:02:23 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:23.599 2 INFO neutron.agent.securitygroups_rpc [None req-def794f9-825a-4b4a-886a-32af99eed4c9 b3c44a0e883d4a21bd13a1fdbfec53c1 1b441b9cc9474cf0bf826c2d3b0ac3a3 - - default default] Security group member updated ['3c90db89-734e-43e3-a179-44c4998e953c']
Nov 26 05:02:23 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:23.734 2 INFO neutron.agent.securitygroups_rpc [None req-5949bdff-b0f4-4302-ae9c-380f8fca807a 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']
Nov 26 05:02:24 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:24.092 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 26 05:02:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 05:02:24 localhost ovn_controller[153664]: 2025-11-26T10:02:24Z|00310|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 05:02:24 localhost nova_compute[281415]: 2025-11-26 10:02:24.253 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:25 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:25.651 2 INFO neutron.agent.securitygroups_rpc [None req-2f2d3199-4d72-4546-9c95-e68d543a1a31 a025d41076dc4bf8b5c42446a20db4a7 083b00bb83474f96865b0c5a38c5f88f - - default default] Security group member updated ['6d45445d-04cd-4f12-afe3-c4bc11ac69da']
Nov 26 05:02:26 localhost nova_compute[281415]: 2025-11-26 10:02:26.384 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:27 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:27.190 2 INFO neutron.agent.securitygroups_rpc [None req-97ce959d-686a-4303-96f4-08ac8f74ff62 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']
Nov 26 05:02:27 localhost podman[240049]: time="2025-11-26T10:02:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 26 05:02:27 localhost podman[240049]: @ - - [26/Nov/2025:10:02:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1"
Nov 26 05:02:27 localhost podman[240049]: @ - - [26/Nov/2025:10:02:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19254 "" "Go-http-client/1.1"
Nov 26 05:02:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e135 e135: 6 total, 6 up, 6 in
Nov 26 05:02:27 localhost nova_compute[281415]: 2025-11-26 10:02:27.642 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:28 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:28.526 262471 INFO neutron.agent.linux.ip_lib [None req-a7acf5ac-b646-47d3-b5da-fb6af4ff6285 - - - - - -] Device tapbc32361e-c7 cannot be used as it has no MAC address
Nov 26 05:02:28 localhost nova_compute[281415]: 2025-11-26 10:02:28.556 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:28 localhost kernel: device tapbc32361e-c7 entered promiscuous mode
Nov 26 05:02:28 localhost NetworkManager[5970]: [1764151348.5652] manager: (tapbc32361e-c7): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Nov 26 05:02:28 localhost ovn_controller[153664]: 2025-11-26T10:02:28Z|00311|binding|INFO|Claiming lport bc32361e-c73f-4f1c-9962-a55f573b5524 for this chassis.
Nov 26 05:02:28 localhost ovn_controller[153664]: 2025-11-26T10:02:28Z|00312|binding|INFO|bc32361e-c73f-4f1c-9962-a55f573b5524: Claiming unknown
Nov 26 05:02:28 localhost nova_compute[281415]: 2025-11-26 10:02:28.566 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:02:28 localhost systemd-udevd[319458]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 05:02:28 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:28.578 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-30c35eef-c17d-4ed0-af26-8569d47bac2a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30c35eef-c17d-4ed0-af26-8569d47bac2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b441b9cc9474cf0bf826c2d3b0ac3a3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e352e40-1543-471e-b0c3-207ab6fc10d5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bc32361e-c73f-4f1c-9962-a55f573b5524) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 26 05:02:28 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:28.580 159486 INFO neutron.agent.ovn.metadata.agent [-] Port bc32361e-c73f-4f1c-9962-a55f573b5524 in datapath 30c35eef-c17d-4ed0-af26-8569d47bac2a bound to our chassis
Nov 26 05:02:28 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:28.581 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 30c35eef-c17d-4ed0-af26-8569d47bac2a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:02:28 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:28.582 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cf48c5-f0ea-41db-8eb2-5019badc1fbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:28 localhost ovn_controller[153664]: 2025-11-26T10:02:28Z|00313|binding|INFO|Setting lport bc32361e-c73f-4f1c-9962-a55f573b5524 ovn-installed in OVS Nov 26 05:02:28 localhost ovn_controller[153664]: 2025-11-26T10:02:28Z|00314|binding|INFO|Setting lport bc32361e-c73f-4f1c-9962-a55f573b5524 up in Southbound Nov 26 05:02:28 localhost nova_compute[281415]: 2025-11-26 10:02:28.591 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:28 localhost nova_compute[281415]: 2025-11-26 10:02:28.601 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:28 localhost nova_compute[281415]: 2025-11-26 10:02:28.608 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:28 localhost nova_compute[281415]: 2025-11-26 10:02:28.651 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:28 localhost nova_compute[281415]: 2025-11-26 10:02:28.732 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:29 localhost podman[319511]: Nov 26 05:02:29 localhost podman[319511]: 2025-11-26 
10:02:29.608837041 +0000 UTC m=+0.098514557 container create 7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30c35eef-c17d-4ed0-af26-8569d47bac2a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 26 05:02:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:29.650 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fe46a36a-8f11-4d0b-8deb-3e489180dd46 with type ""#033[00m Nov 26 05:02:29 localhost ovn_controller[153664]: 2025-11-26T10:02:29Z|00315|binding|INFO|Removing iface tapbc32361e-c7 ovn-installed in OVS Nov 26 05:02:29 localhost ovn_controller[153664]: 2025-11-26T10:02:29Z|00316|binding|INFO|Removing lport bc32361e-c73f-4f1c-9962-a55f573b5524 ovn-installed in OVS Nov 26 05:02:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:29.654 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-30c35eef-c17d-4ed0-af26-8569d47bac2a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30c35eef-c17d-4ed0-af26-8569d47bac2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 
'1b441b9cc9474cf0bf826c2d3b0ac3a3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e352e40-1543-471e-b0c3-207ab6fc10d5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bc32361e-c73f-4f1c-9962-a55f573b5524) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:29 localhost nova_compute[281415]: 2025-11-26 10:02:29.655 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:29.657 159486 INFO neutron.agent.ovn.metadata.agent [-] Port bc32361e-c73f-4f1c-9962-a55f573b5524 in datapath 30c35eef-c17d-4ed0-af26-8569d47bac2a unbound from our chassis#033[00m Nov 26 05:02:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:29.658 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 30c35eef-c17d-4ed0-af26-8569d47bac2a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:02:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:29.659 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f64418-3f2f-4be4-bc11-5e778f0cd3a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:29 localhost podman[319511]: 2025-11-26 10:02:29.561446043 +0000 UTC m=+0.051123589 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:29 localhost nova_compute[281415]: 2025-11-26 10:02:29.661 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:29 localhost systemd[1]: Started libpod-conmon-7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0.scope. Nov 26 05:02:29 localhost systemd[1]: Started libcrun container. Nov 26 05:02:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54b919b904fb9bc99b2b35844f0bb0614794510a785040ab5fced82bde9d021a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:29 localhost podman[319511]: 2025-11-26 10:02:29.71190126 +0000 UTC m=+0.201578756 container init 7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30c35eef-c17d-4ed0-af26-8569d47bac2a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:02:29 localhost podman[319511]: 2025-11-26 10:02:29.721990428 +0000 UTC m=+0.211667914 container start 7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30c35eef-c17d-4ed0-af26-8569d47bac2a, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:02:29 localhost dnsmasq[319530]: started, version 2.85 cachesize 150 Nov 26 05:02:29 localhost 
dnsmasq[319530]: DNS service limited to local subnets Nov 26 05:02:29 localhost dnsmasq[319530]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:29 localhost dnsmasq[319530]: warning: no upstream servers configured Nov 26 05:02:29 localhost dnsmasq[319530]: read /var/lib/neutron/dhcp/30c35eef-c17d-4ed0-af26-8569d47bac2a/addn_hosts - 0 addresses Nov 26 05:02:29 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:29.905 2 INFO neutron.agent.securitygroups_rpc [None req-5a5bbedd-0c37-4fe6-8ed6-c4b6628cd69f bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:30 localhost ovn_controller[153664]: 2025-11-26T10:02:30Z|00317|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:02:30 localhost nova_compute[281415]: 2025-11-26 10:02:30.046 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:30.067 262471 INFO neutron.agent.dhcp.agent [None req-13c3f700-ca6c-47f2-8e8c-ed1c395149a9 - - - - - -] DHCP configuration for ports {'72f0b29a-32e2-4375-89ad-77e39c7d9bf9'} is completed#033[00m Nov 26 05:02:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:02:30 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2934135789' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:02:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:02:30 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2934135789' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:02:30 localhost dnsmasq[319530]: exiting on receipt of SIGTERM Nov 26 05:02:30 localhost podman[319548]: 2025-11-26 10:02:30.266066604 +0000 UTC m=+0.069900703 container kill 7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30c35eef-c17d-4ed0-af26-8569d47bac2a, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:30 localhost systemd[1]: libpod-7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0.scope: Deactivated successfully. 
Nov 26 05:02:30 localhost podman[319561]: 2025-11-26 10:02:30.344232539 +0000 UTC m=+0.061196026 container died 7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30c35eef-c17d-4ed0-af26-8569d47bac2a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:02:30 localhost podman[319561]: 2025-11-26 10:02:30.379795568 +0000 UTC m=+0.096759005 container cleanup 7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30c35eef-c17d-4ed0-af26-8569d47bac2a, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 05:02:30 localhost systemd[1]: libpod-conmon-7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0.scope: Deactivated successfully. 
Nov 26 05:02:30 localhost podman[319563]: 2025-11-26 10:02:30.431884644 +0000 UTC m=+0.140093093 container remove 7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30c35eef-c17d-4ed0-af26-8569d47bac2a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:02:30 localhost nova_compute[281415]: 2025-11-26 10:02:30.448 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:30 localhost kernel: device tapbc32361e-c7 left promiscuous mode Nov 26 05:02:30 localhost nova_compute[281415]: 2025-11-26 10:02:30.460 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:30 localhost systemd[1]: var-lib-containers-storage-overlay-54b919b904fb9bc99b2b35844f0bb0614794510a785040ab5fced82bde9d021a-merged.mount: Deactivated successfully. Nov 26 05:02:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a9591bb37d1a7fae8f91de1221d13c6f43aefc8d9a23d545fd43f4aafeec1a0-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:31 localhost nova_compute[281415]: 2025-11-26 10:02:31.421 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 05:02:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:02:31 localhost podman[319593]: 2025-11-26 10:02:31.850065709 +0000 UTC m=+0.099269600 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 05:02:31 localhost podman[319594]: 2025-11-26 10:02:31.893747976 +0000 UTC m=+0.138450463 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Nov 26 05:02:31 localhost podman[319594]: 2025-11-26 10:02:31.90676469 +0000 UTC m=+0.151467157 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Nov 26 05:02:31 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 05:02:31 localhost podman[319593]: 2025-11-26 10:02:31.961823905 +0000 UTC m=+0.211027856 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:02:31 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:02:32 localhost systemd[1]: run-netns-qdhcp\x2d30c35eef\x2dc17d\x2d4ed0\x2daf26\x2d8569d47bac2a.mount: Deactivated successfully. 
Nov 26 05:02:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:32.101 262471 INFO neutron.agent.dhcp.agent [None req-d41893de-6c39-4697-92cf-48d818037842 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:32.102 262471 INFO neutron.agent.dhcp.agent [None req-d41893de-6c39-4697-92cf-48d818037842 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:32 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:32.591 2 INFO neutron.agent.securitygroups_rpc [None req-5a5bbedd-0c37-4fe6-8ed6-c4b6628cd69f bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:32 localhost nova_compute[281415]: 2025-11-26 10:02:32.665 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:33 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:33.284 2 INFO neutron.agent.securitygroups_rpc [None req-6f55497f-2065-485e-9c46-b48e4b5da11c bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:33 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:33.322 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:33 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:33.725 2 INFO neutron.agent.securitygroups_rpc [None req-da8ec25e-d1fb-4312-addb-4618131f3143 3b83e719d33b438fb483ce2739d86d02 0fb20b38c5e8412e83444d5370b9f73f - - default default] Security group member updated ['bb0f9bb4-ea46-4cd5-9769-51c5f841f116']#033[00m Nov 26 05:02:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e135 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:34 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:34.320 2 INFO neutron.agent.securitygroups_rpc [None req-92bfa7b7-cce5-4428-b332-6807d7b5a581 3b83e719d33b438fb483ce2739d86d02 0fb20b38c5e8412e83444d5370b9f73f - - default default] Security group member updated ['bb0f9bb4-ea46-4cd5-9769-51c5f841f116']#033[00m Nov 26 05:02:34 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:34.459 2 INFO neutron.agent.securitygroups_rpc [None req-21f7b495-dc42-4335-9048-d2e0b6a98fce bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:35 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:35.468 2 INFO neutron.agent.securitygroups_rpc [None req-c677a632-4278-4153-8907-36296559ae29 3b83e719d33b438fb483ce2739d86d02 0fb20b38c5e8412e83444d5370b9f73f - - default default] Security group member updated ['bb0f9bb4-ea46-4cd5-9769-51c5f841f116']#033[00m Nov 26 05:02:35 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e136 e136: 6 total, 6 up, 6 in Nov 26 05:02:35 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:35.700 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:36 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:36.433 2 INFO neutron.agent.securitygroups_rpc [None req-1da49cc9-d6e1-4422-a47e-8cde8804f351 3b83e719d33b438fb483ce2739d86d02 0fb20b38c5e8412e83444d5370b9f73f - - default default] Security group member updated ['bb0f9bb4-ea46-4cd5-9769-51c5f841f116']#033[00m Nov 26 05:02:36 localhost nova_compute[281415]: 2025-11-26 10:02:36.457 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e137 
e137: 6 total, 6 up, 6 in Nov 26 05:02:37 localhost nova_compute[281415]: 2025-11-26 10:02:37.698 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:02:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:02:37 localhost systemd[1]: tmp-crun.llRxGs.mount: Deactivated successfully. Nov 26 05:02:37 localhost podman[319633]: 2025-11-26 10:02:37.862619327 +0000 UTC m=+0.110303344 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 26 05:02:37 localhost podman[319633]: 2025-11-26 10:02:37.912104037 +0000 UTC m=+0.159788054 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 05:02:37 localhost systemd[1]: tmp-crun.gCaGDq.mount: Deactivated successfully. 
Nov 26 05:02:37 localhost podman[319634]: 2025-11-26 10:02:37.920294458 +0000 UTC m=+0.159273758 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64) Nov 26 05:02:37 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:02:37 localhost podman[319634]: 2025-11-26 10:02:37.938317639 +0000 UTC m=+0.177296889 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7) Nov 26 05:02:37 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:37.953 2 INFO neutron.agent.securitygroups_rpc [None req-05f2c7ad-c872-4d30-aa33-aeab7d89cf5d 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']#033[00m Nov 26 05:02:37 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:02:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:39.068 262471 INFO neutron.agent.linux.ip_lib [None req-19eefcee-40ca-4520-b31a-bcecf246cbf2 - - - - - -] Device tap0e740171-68 cannot be used as it has no MAC address#033[00m Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.100 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost kernel: device tap0e740171-68 entered promiscuous mode Nov 26 05:02:39 localhost NetworkManager[5970]: [1764151359.1097] manager: (tap0e740171-68): new Generic device (/org/freedesktop/NetworkManager/Devices/52) Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.110 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost ovn_controller[153664]: 2025-11-26T10:02:39Z|00318|binding|INFO|Claiming lport 0e740171-68cc-4e00-a03e-23261db33f63 for this chassis. Nov 26 05:02:39 localhost ovn_controller[153664]: 2025-11-26T10:02:39Z|00319|binding|INFO|0e740171-68cc-4e00-a03e-23261db33f63: Claiming unknown Nov 26 05:02:39 localhost systemd-udevd[319689]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.118 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.124 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-28eaeab7-5bbd-4432-b01d-9418570f7b99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28eaeab7-5bbd-4432-b01d-9418570f7b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b275d5fc6d14aa0a6fd7bf9cf5d748e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=706617a3-34ce-4eb9-8077-b25188fc93f4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0e740171-68cc-4e00-a03e-23261db33f63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.127 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0e740171-68cc-4e00-a03e-23261db33f63 in datapath 28eaeab7-5bbd-4432-b01d-9418570f7b99 bound to our chassis#033[00m Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.130 159486 DEBUG 
neutron.agent.ovn.metadata.agent [-] Port 2fd311b0-b65a-448d-be41-1696699b8feb IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.130 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28eaeab7-5bbd-4432-b01d-9418570f7b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.132 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d9994e40-5f20-4d11-8deb-397ba973b404]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:39 localhost journal[229445]: ethtool ioctl error on tap0e740171-68: No such device Nov 26 05:02:39 localhost journal[229445]: ethtool ioctl error on tap0e740171-68: No such device Nov 26 05:02:39 localhost journal[229445]: ethtool ioctl error on tap0e740171-68: No such device Nov 26 05:02:39 localhost ovn_controller[153664]: 2025-11-26T10:02:39Z|00320|binding|INFO|Setting lport 0e740171-68cc-4e00-a03e-23261db33f63 ovn-installed in OVS Nov 26 05:02:39 localhost ovn_controller[153664]: 2025-11-26T10:02:39Z|00321|binding|INFO|Setting lport 0e740171-68cc-4e00-a03e-23261db33f63 up in Southbound Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.159 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost journal[229445]: ethtool ioctl error on tap0e740171-68: No such device Nov 26 05:02:39 localhost journal[229445]: ethtool ioctl error on tap0e740171-68: No such device Nov 26 05:02:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:39 localhost journal[229445]: ethtool ioctl error on tap0e740171-68: No such device Nov 26 05:02:39 localhost journal[229445]: ethtool ioctl error on tap0e740171-68: No such device Nov 26 05:02:39 localhost journal[229445]: ethtool ioctl error on tap0e740171-68: No such device Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.203 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.241 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e138 e138: 6 total, 6 up, 6 in Nov 26 05:02:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:39.615 262471 INFO neutron.agent.linux.ip_lib [None req-b7e8a7c5-8835-4b9e-bde3-d6e51ab7af50 - - - - - -] Device tapbbd2f140-75 cannot be used as it has no MAC address#033[00m Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.665 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost kernel: device tapbbd2f140-75 entered promiscuous mode Nov 26 05:02:39 localhost NetworkManager[5970]: [1764151359.6745] manager: (tapbbd2f140-75): new Generic device (/org/freedesktop/NetworkManager/Devices/53) Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.675 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost ovn_controller[153664]: 2025-11-26T10:02:39Z|00322|binding|INFO|Claiming lport bbd2f140-7553-41cb-ae19-80a4b94cbc32 for this chassis. 
Nov 26 05:02:39 localhost ovn_controller[153664]: 2025-11-26T10:02:39Z|00323|binding|INFO|bbd2f140-7553-41cb-ae19-80a4b94cbc32: Claiming unknown Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.688 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5a53efe6-7771-47f7-95e2-63af1ee1b74d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a53efe6-7771-47f7-95e2-63af1ee1b74d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3593a4-08fa-423a-9a18-a188cc78b3c9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bbd2f140-7553-41cb-ae19-80a4b94cbc32) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:39 localhost sshd[319740]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.693 159486 INFO neutron.agent.ovn.metadata.agent [-] Port bbd2f140-7553-41cb-ae19-80a4b94cbc32 in datapath 5a53efe6-7771-47f7-95e2-63af1ee1b74d bound to our chassis#033[00m Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.697 159486 DEBUG 
neutron.agent.ovn.metadata.agent [-] Port 6e2df8a8-0dcc-408a-b47a-47df0523bc22 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.697 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5a53efe6-7771-47f7-95e2-63af1ee1b74d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:39.701 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[c31dd332-050a-4534-8c23-f7164a2566f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:39 localhost ovn_controller[153664]: 2025-11-26T10:02:39Z|00324|binding|INFO|Setting lport bbd2f140-7553-41cb-ae19-80a4b94cbc32 ovn-installed in OVS Nov 26 05:02:39 localhost ovn_controller[153664]: 2025-11-26T10:02:39Z|00325|binding|INFO|Setting lport bbd2f140-7553-41cb-ae19-80a4b94cbc32 up in Southbound Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.721 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.777 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:39 localhost nova_compute[281415]: 2025-11-26 10:02:39.817 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:40 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:40.008 2 INFO neutron.agent.securitygroups_rpc [None req-abdd2016-67c5-4a3b-8e8e-727d444b7df5 
def334c345f4475ba8901c121c08f73a 396008a2b0d9436cab52644f4e54b6ad - - default default] Security group member updated ['a1cecb88-d0b9-4ce5-8ab2-3cde73c509d1']#033[00m Nov 26 05:02:40 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:40.138 2 INFO neutron.agent.securitygroups_rpc [None req-d89c9e49-79ce-4424-84e1-48fd0b011c23 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:40 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:40.254 2 INFO neutron.agent.securitygroups_rpc [None req-abdd2016-67c5-4a3b-8e8e-727d444b7df5 def334c345f4475ba8901c121c08f73a 396008a2b0d9436cab52644f4e54b6ad - - default default] Security group member updated ['a1cecb88-d0b9-4ce5-8ab2-3cde73c509d1']#033[00m Nov 26 05:02:40 localhost podman[319798]: Nov 26 05:02:40 localhost podman[319798]: 2025-11-26 10:02:40.355557199 +0000 UTC m=+0.093379475 container create 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:02:40 localhost podman[319798]: 2025-11-26 10:02:40.306073249 +0000 UTC m=+0.043895515 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:40 localhost systemd[1]: Started libpod-conmon-330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02.scope. Nov 26 05:02:40 localhost systemd[1]: tmp-crun.v0KSue.mount: Deactivated successfully. 
Nov 26 05:02:40 localhost systemd[1]: Started libcrun container. Nov 26 05:02:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11d9cd093715b5b5a080bd9cb65aaf37c42471c562d6967139b417e0b72c52af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:40 localhost podman[319798]: 2025-11-26 10:02:40.467961664 +0000 UTC m=+0.205783940 container init 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:40 localhost podman[319798]: 2025-11-26 10:02:40.478246977 +0000 UTC m=+0.216069243 container start 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 26 05:02:40 localhost dnsmasq[319820]: started, version 2.85 cachesize 150 Nov 26 05:02:40 localhost dnsmasq[319820]: DNS service limited to local subnets Nov 26 05:02:40 localhost dnsmasq[319820]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 
05:02:40 localhost dnsmasq[319820]: warning: no upstream servers configured Nov 26 05:02:40 localhost dnsmasq-dhcp[319820]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:02:40 localhost dnsmasq[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/addn_hosts - 0 addresses Nov 26 05:02:40 localhost dnsmasq-dhcp[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/host Nov 26 05:02:40 localhost dnsmasq-dhcp[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/opts Nov 26 05:02:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:40.616 262471 INFO neutron.agent.dhcp.agent [None req-60273e18-85db-4e92-b920-7a37c77eee93 - - - - - -] DHCP configuration for ports {'f68aa2aa-f76f-41e2-bd23-348af8dfa5cd'} is completed#033[00m Nov 26 05:02:40 localhost podman[319843]: Nov 26 05:02:40 localhost podman[319843]: 2025-11-26 10:02:40.80253025 +0000 UTC m=+0.099750752 container create 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:02:40 localhost systemd[1]: Started libpod-conmon-07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9.scope. Nov 26 05:02:40 localhost podman[319843]: 2025-11-26 10:02:40.757220624 +0000 UTC m=+0.054441166 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:40 localhost systemd[1]: Started libcrun container. 
Nov 26 05:02:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e459e9cc559d96aac2ad3c4832964d26cb5878f16d214ee3d9de6605f75e0941/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:40 localhost podman[319843]: 2025-11-26 10:02:40.870842216 +0000 UTC m=+0.168062718 container init 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:02:40 localhost podman[319843]: 2025-11-26 10:02:40.881341606 +0000 UTC m=+0.178562108 container start 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:02:40 localhost dnsmasq[319861]: started, version 2.85 cachesize 150 Nov 26 05:02:40 localhost dnsmasq[319861]: DNS service limited to local subnets Nov 26 05:02:40 localhost dnsmasq[319861]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:40 localhost dnsmasq[319861]: warning: no upstream servers 
configured Nov 26 05:02:40 localhost dnsmasq-dhcp[319861]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:02:40 localhost dnsmasq[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/addn_hosts - 0 addresses Nov 26 05:02:40 localhost dnsmasq-dhcp[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/host Nov 26 05:02:40 localhost dnsmasq-dhcp[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/opts Nov 26 05:02:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:40.947 262471 INFO neutron.agent.dhcp.agent [None req-7fb0a686-9b4b-40e3-b464-824bdaddcff2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:39Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1437552d-aa23-44de-9640-d50158d0a4db, ip_allocation=immediate, mac_address=fa:16:3e:49:4f:21, name=tempest-PortsTestJSON-100716904, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:36Z, description=, dns_domain=, id=5a53efe6-7771-47f7-95e2-63af1ee1b74d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-810526649, port_security_enabled=True, project_id=c9a6d35bfc5f440e9fdc4ed36d883eff, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63013, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1893, status=ACTIVE, subnets=['f05cdbcc-d81c-4001-8834-cf7baee6750f'], tags=[], tenant_id=c9a6d35bfc5f440e9fdc4ed36d883eff, updated_at=2025-11-26T10:02:37Z, vlan_transparent=None, network_id=5a53efe6-7771-47f7-95e2-63af1ee1b74d, port_security_enabled=True, project_id=c9a6d35bfc5f440e9fdc4ed36d883eff, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['512f55ba-befd-448e-8449-d75d9733402e'], standard_attr_id=1919, status=DOWN, tags=[], tenant_id=c9a6d35bfc5f440e9fdc4ed36d883eff, updated_at=2025-11-26T10:02:39Z on network 5a53efe6-7771-47f7-95e2-63af1ee1b74d#033[00m Nov 26 05:02:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:41.056 262471 INFO neutron.agent.dhcp.agent [None req-52719bea-3736-45eb-a5cd-93b2529223cb - - - - - -] DHCP configuration for ports {'7441db4e-f4d3-4637-9e32-ae5754b2720c'} is completed#033[00m Nov 26 05:02:41 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:41.116 2 INFO neutron.agent.securitygroups_rpc [None req-3308d4c8-a223-41aa-94a8-d85d6c29a647 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:41 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:41.169 2 INFO neutron.agent.securitygroups_rpc [None req-3c972f72-291a-4b72-bb7f-98d291116257 def334c345f4475ba8901c121c08f73a 396008a2b0d9436cab52644f4e54b6ad - - default default] Security group member updated ['a1cecb88-d0b9-4ce5-8ab2-3cde73c509d1']#033[00m Nov 26 05:02:41 localhost dnsmasq[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/addn_hosts - 1 addresses Nov 26 05:02:41 localhost dnsmasq-dhcp[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/host Nov 26 05:02:41 localhost dnsmasq-dhcp[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/opts Nov 26 05:02:41 localhost podman[319879]: 2025-11-26 10:02:41.189457223 +0000 UTC m=+0.066028369 container kill 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:02:41 localhost nova_compute[281415]: 2025-11-26 10:02:41.498 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:41.513 262471 INFO neutron.agent.dhcp.agent [None req-d4b04dfc-e113-4cde-9275-1e2c1132aeb8 - - - - - -] DHCP configuration for ports {'1437552d-aa23-44de-9640-d50158d0a4db'} is completed#033[00m Nov 26 05:02:41 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:41.542 2 INFO neutron.agent.securitygroups_rpc [None req-cf888aec-76ed-4e62-b1c7-f42201f78d4c def334c345f4475ba8901c121c08f73a 396008a2b0d9436cab52644f4e54b6ad - - default default] Security group member updated ['a1cecb88-d0b9-4ce5-8ab2-3cde73c509d1']#033[00m Nov 26 05:02:41 localhost dnsmasq[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/addn_hosts - 0 addresses Nov 26 05:02:41 localhost dnsmasq-dhcp[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/host Nov 26 05:02:41 localhost dnsmasq-dhcp[319861]: read /var/lib/neutron/dhcp/5a53efe6-7771-47f7-95e2-63af1ee1b74d/opts Nov 26 05:02:41 localhost podman[319917]: 2025-11-26 10:02:41.608594363 +0000 UTC m=+0.068330636 container kill 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:41 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:41.615 2 INFO neutron.agent.securitygroups_rpc [None req-56dad956-421b-4f3d-9bf3-049310340781 b3c44a0e883d4a21bd13a1fdbfec53c1 1b441b9cc9474cf0bf826c2d3b0ac3a3 - - default default] Security group member updated ['3c90db89-734e-43e3-a179-44c4998e953c']#033[00m Nov 26 05:02:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:02:41 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:41.768 2 INFO neutron.agent.securitygroups_rpc [None req-35c30512-3402-40a5-8857-5f6adedee4c5 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']#033[00m Nov 26 05:02:41 localhost podman[319931]: 2025-11-26 10:02:41.787649216 +0000 UTC m=+0.140221438 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 05:02:41 localhost podman[319931]: 2025-11-26 10:02:41.825237602 +0000 UTC m=+0.177809814 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible) Nov 26 05:02:41 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 05:02:41 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:41.965 262471 INFO neutron.agent.linux.ip_lib [None req-a05fb46f-b3a7-483f-908a-4e4b0f5286ed - - - - - -] Device tap10dfb5e4-9c cannot be used as it has no MAC address#033[00m Nov 26 05:02:41 localhost nova_compute[281415]: 2025-11-26 10:02:41.992 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:42 localhost kernel: device tap10dfb5e4-9c entered promiscuous mode Nov 26 05:02:42 localhost NetworkManager[5970]: [1764151362.0026] manager: (tap10dfb5e4-9c): new Generic device (/org/freedesktop/NetworkManager/Devices/54) Nov 26 05:02:42 localhost nova_compute[281415]: 2025-11-26 10:02:42.002 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:42 localhost ovn_controller[153664]: 2025-11-26T10:02:42Z|00326|binding|INFO|Claiming lport 10dfb5e4-9c8b-465e-ad19-d1f21f7259ab for this chassis. 
Nov 26 05:02:42 localhost ovn_controller[153664]: 2025-11-26T10:02:42Z|00327|binding|INFO|10dfb5e4-9c8b-465e-ad19-d1f21f7259ab: Claiming unknown Nov 26 05:02:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:42.017 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b275d5fc6d14aa0a6fd7bf9cf5d748e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d41971c-67fe-45e3-af0f-8b1ac08af602, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=10dfb5e4-9c8b-465e-ad19-d1f21f7259ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:42.019 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 10dfb5e4-9c8b-465e-ad19-d1f21f7259ab in datapath 5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06 bound to our chassis#033[00m Nov 26 05:02:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:42.022 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:02:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:42.024 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[7406d4ed-539d-4db8-bfc3-137eaf579ce3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:42 localhost ovn_controller[153664]: 2025-11-26T10:02:42Z|00328|binding|INFO|Setting lport 10dfb5e4-9c8b-465e-ad19-d1f21f7259ab ovn-installed in OVS Nov 26 05:02:42 localhost ovn_controller[153664]: 2025-11-26T10:02:42Z|00329|binding|INFO|Setting lport 10dfb5e4-9c8b-465e-ad19-d1f21f7259ab up in Southbound Nov 26 05:02:42 localhost nova_compute[281415]: 2025-11-26 10:02:42.027 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:42 localhost nova_compute[281415]: 2025-11-26 10:02:42.046 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:42 localhost nova_compute[281415]: 2025-11-26 10:02:42.106 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:42 localhost dnsmasq[319861]: exiting on receipt of SIGTERM Nov 26 05:02:42 localhost podman[319989]: 2025-11-26 10:02:42.121147209 +0000 UTC m=+0.072373555 container kill 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:02:42 localhost systemd[1]: libpod-07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9.scope: Deactivated successfully. Nov 26 05:02:42 localhost nova_compute[281415]: 2025-11-26 10:02:42.153 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:42 localhost podman[320010]: 2025-11-26 10:02:42.202104217 +0000 UTC m=+0.064723950 container died 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:42 localhost podman[320010]: 2025-11-26 10:02:42.238998195 +0000 UTC m=+0.101617918 container cleanup 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:02:42 localhost systemd[1]: libpod-conmon-07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9.scope: Deactivated successfully. 
Nov 26 05:02:42 localhost podman[320013]: 2025-11-26 10:02:42.281753075 +0000 UTC m=+0.132543649 container remove 07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a53efe6-7771-47f7-95e2-63af1ee1b74d, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:42 localhost ovn_controller[153664]: 2025-11-26T10:02:42Z|00330|binding|INFO|Releasing lport bbd2f140-7553-41cb-ae19-80a4b94cbc32 from this chassis (sb_readonly=0) Nov 26 05:02:42 localhost ovn_controller[153664]: 2025-11-26T10:02:42Z|00331|binding|INFO|Setting lport bbd2f140-7553-41cb-ae19-80a4b94cbc32 down in Southbound Nov 26 05:02:42 localhost kernel: device tapbbd2f140-75 left promiscuous mode Nov 26 05:02:42 localhost nova_compute[281415]: 2025-11-26 10:02:42.299 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:42.306 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5a53efe6-7771-47f7-95e2-63af1ee1b74d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-5a53efe6-7771-47f7-95e2-63af1ee1b74d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3593a4-08fa-423a-9a18-a188cc78b3c9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bbd2f140-7553-41cb-ae19-80a4b94cbc32) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:42.308 159486 INFO neutron.agent.ovn.metadata.agent [-] Port bbd2f140-7553-41cb-ae19-80a4b94cbc32 in datapath 5a53efe6-7771-47f7-95e2-63af1ee1b74d unbound from our chassis#033[00m Nov 26 05:02:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:42.310 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5a53efe6-7771-47f7-95e2-63af1ee1b74d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:42.311 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d293fbc2-7a02-4f8c-a395-51d92e6ad45a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:42 localhost nova_compute[281415]: 2025-11-26 10:02:42.317 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:42 localhost systemd[1]: 
var-lib-containers-storage-overlay-e459e9cc559d96aac2ad3c4832964d26cb5878f16d214ee3d9de6605f75e0941-merged.mount: Deactivated successfully. Nov 26 05:02:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-07c06c47efa572859ac020a4160cafc2a681c098b6beb860f0c0af1605a545a9-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:42 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:42.558 262471 INFO neutron.agent.dhcp.agent [None req-3f14db4e-6e9c-47c8-9b08-d1afcf468ce7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:42 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:42.559 262471 INFO neutron.agent.dhcp.agent [None req-3f14db4e-6e9c-47c8-9b08-d1afcf468ce7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:42 localhost systemd[1]: run-netns-qdhcp\x2d5a53efe6\x2d7771\x2d47f7\x2d95e2\x2d63af1ee1b74d.mount: Deactivated successfully. Nov 26 05:02:42 localhost nova_compute[281415]: 2025-11-26 10:02:42.729 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:43 localhost podman[320083]: Nov 26 05:02:43 localhost podman[320083]: 2025-11-26 10:02:43.135242527 +0000 UTC m=+0.101046961 container create 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:02:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:43.151 262471 INFO neutron.agent.dhcp.agent 
[-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:43 localhost systemd[1]: Started libpod-conmon-3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a.scope. Nov 26 05:02:43 localhost podman[320083]: 2025-11-26 10:02:43.085642784 +0000 UTC m=+0.051447288 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:43 localhost systemd[1]: Started libcrun container. Nov 26 05:02:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4522744029507b27dcb165f126932b25c4a56eb81c6fcf579d2fdea5eb41895/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:43 localhost podman[320083]: 2025-11-26 10:02:43.238658537 +0000 UTC m=+0.204462971 container init 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 26 05:02:43 localhost podman[320083]: 2025-11-26 10:02:43.249039693 +0000 UTC m=+0.214844127 container start 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
tcib_managed=true) Nov 26 05:02:43 localhost dnsmasq[320102]: started, version 2.85 cachesize 150 Nov 26 05:02:43 localhost dnsmasq[320102]: DNS service limited to local subnets Nov 26 05:02:43 localhost dnsmasq[320102]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:43 localhost dnsmasq[320102]: warning: no upstream servers configured Nov 26 05:02:43 localhost dnsmasq-dhcp[320102]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:02:43 localhost dnsmasq[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/addn_hosts - 0 addresses Nov 26 05:02:43 localhost dnsmasq-dhcp[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/host Nov 26 05:02:43 localhost dnsmasq-dhcp[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/opts Nov 26 05:02:43 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:43.317 2 INFO neutron.agent.securitygroups_rpc [None req-23d105ed-75f3-4ea5-b46a-a43928c3f0b3 b3c44a0e883d4a21bd13a1fdbfec53c1 1b441b9cc9474cf0bf826c2d3b0ac3a3 - - default default] Security group member updated ['3c90db89-734e-43e3-a179-44c4998e953c']#033[00m Nov 26 05:02:43 localhost systemd[1]: tmp-crun.qvsRbd.mount: Deactivated successfully. 
Nov 26 05:02:43 localhost ovn_controller[153664]: 2025-11-26T10:02:43Z|00332|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:02:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e139 e139: 6 total, 6 up, 6 in Nov 26 05:02:43 localhost nova_compute[281415]: 2025-11-26 10:02:43.416 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:43.422 262471 INFO neutron.agent.dhcp.agent [None req-128d9672-0f5d-49c1-a6e6-1dd19648c5c5 - - - - - -] DHCP configuration for ports {'0326e332-e30b-4127-9a54-ee521ce19aaf'} is completed#033[00m Nov 26 05:02:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:44 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:44.563 2 INFO neutron.agent.securitygroups_rpc [None req-b482d17e-ff4e-427b-8d70-4ae8087f2fde 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']#033[00m Nov 26 05:02:45 localhost dnsmasq[318597]: exiting on receipt of SIGTERM Nov 26 05:02:45 localhost podman[320119]: 2025-11-26 10:02:45.35085695 +0000 UTC m=+0.061290528 container kill e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b259add-0a8d-49d3-827c-570e822875fa, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118) Nov 26 05:02:45 localhost systemd[1]: libpod-e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560.scope: Deactivated successfully. Nov 26 05:02:45 localhost podman[320135]: 2025-11-26 10:02:45.412192219 +0000 UTC m=+0.040762583 container died e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b259add-0a8d-49d3-827c-570e822875fa, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:45 localhost systemd[1]: tmp-crun.hmuYMu.mount: Deactivated successfully. Nov 26 05:02:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:45.444 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:45Z, description=, device_id=11fc9dbd-19ef-4c05-9396-91e6e2765ccc, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6ce25750-5ef7-4268-93cf-8fb4d18370c5, ip_allocation=immediate, mac_address=fa:16:3e:4d:38:bf, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:38Z, description=, dns_domain=, id=5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--812377686, port_security_enabled=True, project_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=42415, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1907, status=ACTIVE, subnets=['78bd9bf9-9838-44ee-9ce2-75459fc43653'], tags=[], tenant_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, updated_at=2025-11-26T10:02:40Z, vlan_transparent=None, network_id=5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, port_security_enabled=False, project_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1958, status=DOWN, tags=[], tenant_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, updated_at=2025-11-26T10:02:45Z on network 5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06#033[00m Nov 26 05:02:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:45 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e140 e140: 6 total, 6 up, 6 in Nov 26 05:02:45 localhost podman[320135]: 2025-11-26 10:02:45.484647266 +0000 UTC m=+0.113217580 container remove e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8b259add-0a8d-49d3-827c-570e822875fa, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:45 localhost ovn_controller[153664]: 2025-11-26T10:02:45Z|00333|binding|INFO|Releasing lport 0da4c8be-1871-4fdf-8ba5-062b50e238ab from this chassis (sb_readonly=0) Nov 26 05:02:45 localhost kernel: device tap0da4c8be-18 left promiscuous mode Nov 26 05:02:45 localhost ovn_controller[153664]: 
2025-11-26T10:02:45Z|00334|binding|INFO|Setting lport 0da4c8be-1871-4fdf-8ba5-062b50e238ab down in Southbound Nov 26 05:02:45 localhost nova_compute[281415]: 2025-11-26 10:02:45.505 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:45 localhost nova_compute[281415]: 2025-11-26 10:02:45.523 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:45 localhost systemd[1]: libpod-conmon-e8bafa6526788474e8354e06662f1c8e9561692b4efdf9372dce6c5ebe811560.scope: Deactivated successfully. Nov 26 05:02:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:45.597 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-8b259add-0a8d-49d3-827c-570e822875fa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b259add-0a8d-49d3-827c-570e822875fa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b441b9cc9474cf0bf826c2d3b0ac3a3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=027d1b16-cf7e-43f2-89a2-4093a0950f2f, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=0da4c8be-1871-4fdf-8ba5-062b50e238ab) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:45.600 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0da4c8be-1871-4fdf-8ba5-062b50e238ab in datapath 8b259add-0a8d-49d3-827c-570e822875fa unbound from our chassis#033[00m Nov 26 05:02:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:45.601 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8b259add-0a8d-49d3-827c-570e822875fa or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:02:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:45.604 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[914f449b-1653-447a-8ffc-fa8eaaf9da1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:45 localhost dnsmasq[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/addn_hosts - 1 addresses Nov 26 05:02:45 localhost dnsmasq-dhcp[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/host Nov 26 05:02:45 localhost dnsmasq-dhcp[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/opts Nov 26 05:02:45 localhost podman[320178]: 2025-11-26 10:02:45.689708154 +0000 UTC m=+0.063344570 container kill 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:02:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:45.750 262471 INFO neutron.agent.dhcp.agent [None req-97ba40d1-1720-4a68-9dc1-081b2d1787b0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:45 localhost openstack_network_exporter[242153]: ERROR 10:02:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:02:45 localhost openstack_network_exporter[242153]: Nov 26 05:02:45 localhost openstack_network_exporter[242153]: ERROR 10:02:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:02:45 localhost openstack_network_exporter[242153]: ERROR 10:02:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:02:45 localhost openstack_network_exporter[242153]: ERROR 10:02:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:02:45 localhost openstack_network_exporter[242153]: ERROR 10:02:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:02:45 localhost openstack_network_exporter[242153]: Nov 26 05:02:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:45.973 262471 INFO neutron.agent.dhcp.agent [None req-7ae616ad-013a-4bdc-a8bb-559d07cf981a - - - - - -] DHCP configuration for ports {'6ce25750-5ef7-4268-93cf-8fb4d18370c5'} is completed#033[00m Nov 26 05:02:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:46.009 262471 INFO neutron.agent.linux.ip_lib [None req-002c8c95-9189-4a00-8a6c-28451472220b - - - - - -] Device tap33e150bc-3f cannot be used as it has no MAC address#033[00m Nov 26 05:02:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:46.034 262471 INFO 
neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:46 localhost nova_compute[281415]: 2025-11-26 10:02:46.046 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:46 localhost kernel: device tap33e150bc-3f entered promiscuous mode Nov 26 05:02:46 localhost NetworkManager[5970]: [1764151366.0559] manager: (tap33e150bc-3f): new Generic device (/org/freedesktop/NetworkManager/Devices/55) Nov 26 05:02:46 localhost nova_compute[281415]: 2025-11-26 10:02:46.057 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:46 localhost systemd-udevd[320208]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:02:46 localhost ovn_controller[153664]: 2025-11-26T10:02:46Z|00335|binding|INFO|Claiming lport 33e150bc-3fa4-48f6-af9d-1b8a75506874 for this chassis. 
Nov 26 05:02:46 localhost ovn_controller[153664]: 2025-11-26T10:02:46Z|00336|binding|INFO|33e150bc-3fa4-48f6-af9d-1b8a75506874: Claiming unknown Nov 26 05:02:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:46.082 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-95c06944-bfb3-4728-916a-c0571cdd5f7f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-95c06944-bfb3-4728-916a-c0571cdd5f7f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3a9d396-1135-46bc-b223-0f2a7d2d2d01, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=33e150bc-3fa4-48f6-af9d-1b8a75506874) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:46 localhost journal[229445]: ethtool ioctl error on tap33e150bc-3f: No such device Nov 26 05:02:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:46.085 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 33e150bc-3fa4-48f6-af9d-1b8a75506874 in datapath 95c06944-bfb3-4728-916a-c0571cdd5f7f bound to our chassis#033[00m Nov 26 05:02:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:46.088 
159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port ece727fd-8fe6-4b9f-a775-8ebffc828687 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:02:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:46.089 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 95c06944-bfb3-4728-916a-c0571cdd5f7f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:46 localhost ovn_controller[153664]: 2025-11-26T10:02:46Z|00337|binding|INFO|Setting lport 33e150bc-3fa4-48f6-af9d-1b8a75506874 ovn-installed in OVS Nov 26 05:02:46 localhost ovn_controller[153664]: 2025-11-26T10:02:46Z|00338|binding|INFO|Setting lport 33e150bc-3fa4-48f6-af9d-1b8a75506874 up in Southbound Nov 26 05:02:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:46.090 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[fc155d20-2e5a-4697-8005-957b37c734c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:46 localhost nova_compute[281415]: 2025-11-26 10:02:46.089 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:46 localhost nova_compute[281415]: 2025-11-26 10:02:46.093 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:46 localhost journal[229445]: ethtool ioctl error on tap33e150bc-3f: No such device Nov 26 05:02:46 localhost journal[229445]: ethtool ioctl error on tap33e150bc-3f: No such device Nov 26 05:02:46 localhost journal[229445]: ethtool ioctl error on tap33e150bc-3f: No such device Nov 26 05:02:46 localhost journal[229445]: ethtool ioctl error on tap33e150bc-3f: No such 
device Nov 26 05:02:46 localhost journal[229445]: ethtool ioctl error on tap33e150bc-3f: No such device Nov 26 05:02:46 localhost journal[229445]: ethtool ioctl error on tap33e150bc-3f: No such device Nov 26 05:02:46 localhost journal[229445]: ethtool ioctl error on tap33e150bc-3f: No such device Nov 26 05:02:46 localhost nova_compute[281415]: 2025-11-26 10:02:46.146 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:46 localhost nova_compute[281415]: 2025-11-26 10:02:46.181 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:46 localhost systemd[1]: var-lib-containers-storage-overlay-4a02c371f87d27f3c85733ea0f63759d8c40f385e9712b4b79a5dc884e8b8ec6-merged.mount: Deactivated successfully. Nov 26 05:02:46 localhost systemd[1]: run-netns-qdhcp\x2d8b259add\x2d0a8d\x2d49d3\x2d827c\x2d570e822875fa.mount: Deactivated successfully. Nov 26 05:02:46 localhost ovn_controller[153664]: 2025-11-26T10:02:46Z|00339|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:02:46 localhost nova_compute[281415]: 2025-11-26 10:02:46.389 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:46 localhost nova_compute[281415]: 2025-11-26 10:02:46.500 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 05:02:46 localhost podman[320257]: 2025-11-26 10:02:46.834805195 +0000 UTC m=+0.092146329 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 05:02:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:02:46 localhost podman[320257]: 2025-11-26 10:02:46.847220841 +0000 UTC m=+0.104561945 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:46 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:02:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:46.933 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:45Z, description=, device_id=11fc9dbd-19ef-4c05-9396-91e6e2765ccc, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6ce25750-5ef7-4268-93cf-8fb4d18370c5, ip_allocation=immediate, mac_address=fa:16:3e:4d:38:bf, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:38Z, description=, dns_domain=, id=5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--812377686, port_security_enabled=True, project_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42415, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1907, status=ACTIVE, subnets=['78bd9bf9-9838-44ee-9ce2-75459fc43653'], tags=[], tenant_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, updated_at=2025-11-26T10:02:40Z, vlan_transparent=None, network_id=5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, port_security_enabled=False, project_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1958, status=DOWN, tags=[], tenant_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, updated_at=2025-11-26T10:02:45Z on network 5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06#033[00m Nov 26 05:02:46 localhost podman[320275]: 2025-11-26 10:02:46.949784436 +0000 UTC m=+0.088650196 container health_status 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:47 localhost podman[320275]: 2025-11-26 10:02:47.042451809 +0000 UTC m=+0.181317579 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 05:02:47 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:02:47 localhost podman[320328]: Nov 26 05:02:47 localhost podman[320328]: 2025-11-26 10:02:47.134414821 +0000 UTC m=+0.093683034 container create 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:47 localhost podman[320328]: 2025-11-26 10:02:47.084162029 +0000 UTC m=+0.043430272 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:47 localhost systemd[1]: Started libpod-conmon-5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8.scope. Nov 26 05:02:47 localhost systemd[1]: Started libcrun container. 
Nov 26 05:02:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38781ffbcdadc1e0b4672942671cd276e62efdc06bb8991accdbac5831cafd76/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:47 localhost dnsmasq[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/addn_hosts - 1 addresses Nov 26 05:02:47 localhost dnsmasq-dhcp[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/host Nov 26 05:02:47 localhost podman[320343]: 2025-11-26 10:02:47.225326962 +0000 UTC m=+0.069240323 container kill 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:02:47 localhost dnsmasq-dhcp[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/opts Nov 26 05:02:47 localhost podman[320328]: 2025-11-26 10:02:47.270999659 +0000 UTC m=+0.230267872 container init 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 26 05:02:47 localhost podman[320328]: 2025-11-26 10:02:47.281598111 
+0000 UTC m=+0.240866314 container start 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:47 localhost dnsmasq[320364]: started, version 2.85 cachesize 150 Nov 26 05:02:47 localhost dnsmasq[320364]: DNS service limited to local subnets Nov 26 05:02:47 localhost dnsmasq[320364]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:47 localhost dnsmasq[320364]: warning: no upstream servers configured Nov 26 05:02:47 localhost dnsmasq-dhcp[320364]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:02:47 localhost dnsmasq[320364]: read /var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/addn_hosts - 0 addresses Nov 26 05:02:47 localhost dnsmasq-dhcp[320364]: read /var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/host Nov 26 05:02:47 localhost dnsmasq-dhcp[320364]: read /var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/opts Nov 26 05:02:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:47.386 262471 INFO neutron.agent.dhcp.agent [None req-218af7c1-81b6-4264-bede-29988c1573e9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:45Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, 
extra_dhcp_opts=[], fixed_ips=[], id=67c29dc6-b73a-4dea-a662-9ddaa4a5ef56, ip_allocation=immediate, mac_address=fa:16:3e:36:8c:d7, name=tempest-PortsTestJSON-1430192844, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:43Z, description=, dns_domain=, id=95c06944-bfb3-4728-916a-c0571cdd5f7f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-905054695, port_security_enabled=True, project_id=c9a6d35bfc5f440e9fdc4ed36d883eff, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12202, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1942, status=ACTIVE, subnets=['80839bc8-f0ea-420b-8312-b21b268b2c76'], tags=[], tenant_id=c9a6d35bfc5f440e9fdc4ed36d883eff, updated_at=2025-11-26T10:02:44Z, vlan_transparent=None, network_id=95c06944-bfb3-4728-916a-c0571cdd5f7f, port_security_enabled=True, project_id=c9a6d35bfc5f440e9fdc4ed36d883eff, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1960, status=DOWN, tags=[], tenant_id=c9a6d35bfc5f440e9fdc4ed36d883eff, updated_at=2025-11-26T10:02:45Z on network 95c06944-bfb3-4728-916a-c0571cdd5f7f#033[00m Nov 26 05:02:47 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:47.402 2 INFO neutron.agent.securitygroups_rpc [None req-010e131e-cc41-460a-a7e6-86078fc5ec94 40f3024e4fce4769b0bb53ffd46eddc6 b9b074de0d914b61b080455f8b5f200f - - default default] Security group member updated ['cb0d2a88-45b2-4673-bd4e-248d24478fa7']#033[00m Nov 26 05:02:47 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e141 e141: 6 total, 6 up, 6 in Nov 26 05:02:47 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:47.606 2 INFO neutron.agent.securitygroups_rpc [None req-f3999741-e2d0-4c24-a30d-40a39e7937e1 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default 
default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']#033[00m Nov 26 05:02:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:47.621 262471 INFO neutron.agent.dhcp.agent [None req-7ca93d39-6633-452d-a13b-0cfca3aafcd9 - - - - - -] DHCP configuration for ports {'6ce25750-5ef7-4268-93cf-8fb4d18370c5', '59a1e7fe-d788-4b48-97ce-6c34d48dac83'} is completed#033[00m Nov 26 05:02:47 localhost podman[320388]: 2025-11-26 10:02:47.664064491 +0000 UTC m=+0.074479297 container kill 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 26 05:02:47 localhost dnsmasq[320364]: read /var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/addn_hosts - 1 addresses Nov 26 05:02:47 localhost dnsmasq-dhcp[320364]: read /var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/host Nov 26 05:02:47 localhost dnsmasq-dhcp[320364]: read /var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/opts Nov 26 05:02:47 localhost nova_compute[281415]: 2025-11-26 10:02:47.763 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:47.947 262471 INFO neutron.agent.dhcp.agent [None req-17cf79bd-f870-41b8-9428-bdb1fb8185aa - - - - - -] DHCP configuration for ports {'67c29dc6-b73a-4dea-a662-9ddaa4a5ef56'} is completed#033[00m Nov 26 05:02:48 localhost dnsmasq[320364]: read 
/var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/addn_hosts - 0 addresses Nov 26 05:02:48 localhost dnsmasq-dhcp[320364]: read /var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/host Nov 26 05:02:48 localhost dnsmasq-dhcp[320364]: read /var/lib/neutron/dhcp/95c06944-bfb3-4728-916a-c0571cdd5f7f/opts Nov 26 05:02:48 localhost podman[320428]: 2025-11-26 10:02:48.147464047 +0000 UTC m=+0.070784768 container kill 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:02:48 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:48.339 2 INFO neutron.agent.securitygroups_rpc [None req-91cf4bce-6ebc-4847-ab94-af98d7f8b0a3 40f3024e4fce4769b0bb53ffd46eddc6 b9b074de0d914b61b080455f8b5f200f - - default default] Security group member updated ['cb0d2a88-45b2-4673-bd4e-248d24478fa7']#033[00m Nov 26 05:02:48 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e142 e142: 6 total, 6 up, 6 in Nov 26 05:02:48 localhost dnsmasq[320364]: exiting on receipt of SIGTERM Nov 26 05:02:48 localhost systemd[1]: tmp-crun.PLEsKa.mount: Deactivated successfully. 
Nov 26 05:02:48 localhost podman[320465]: 2025-11-26 10:02:48.673553633 +0000 UTC m=+0.070860041 container kill 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 26 05:02:48 localhost systemd[1]: libpod-5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8.scope: Deactivated successfully. Nov 26 05:02:48 localhost podman[320479]: 2025-11-26 10:02:48.751818901 +0000 UTC m=+0.064916035 container died 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:02:48 localhost podman[320479]: 2025-11-26 10:02:48.794911332 +0000 UTC m=+0.108008426 container cleanup 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:02:48 localhost systemd[1]: libpod-conmon-5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8.scope: Deactivated successfully. Nov 26 05:02:48 localhost podman[320481]: 2025-11-26 10:02:48.885515364 +0000 UTC m=+0.188234602 container remove 5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-95c06944-bfb3-4728-916a-c0571cdd5f7f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:02:48 localhost ovn_controller[153664]: 2025-11-26T10:02:48Z|00340|binding|INFO|Removing iface tap33e150bc-3f ovn-installed in OVS Nov 26 05:02:48 localhost ovn_controller[153664]: 2025-11-26T10:02:48Z|00341|binding|INFO|Removing lport 33e150bc-3fa4-48f6-af9d-1b8a75506874 ovn-installed in OVS Nov 26 05:02:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:48.898 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port ece727fd-8fe6-4b9f-a775-8ebffc828687 with type ""#033[00m Nov 26 05:02:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:48.899 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-95c06944-bfb3-4728-916a-c0571cdd5f7f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-95c06944-bfb3-4728-916a-c0571cdd5f7f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3a9d396-1135-46bc-b223-0f2a7d2d2d01, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=33e150bc-3fa4-48f6-af9d-1b8a75506874) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:48.900 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 33e150bc-3fa4-48f6-af9d-1b8a75506874 in datapath 95c06944-bfb3-4728-916a-c0571cdd5f7f unbound from our chassis#033[00m Nov 26 05:02:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:48.902 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 95c06944-bfb3-4728-916a-c0571cdd5f7f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:48.904 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[190d08f0-347e-416e-b92d-ee133c14b22b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:48 localhost nova_compute[281415]: 2025-11-26 10:02:48.947 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:48 localhost dnsmasq[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/addn_hosts - 0 addresses Nov 26 05:02:48 localhost dnsmasq-dhcp[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/host Nov 26 05:02:48 localhost podman[320522]: 2025-11-26 10:02:48.948391308 +0000 UTC m=+0.099577317 container kill 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:02:48 localhost kernel: device tap33e150bc-3f left promiscuous mode Nov 26 05:02:48 localhost dnsmasq-dhcp[320102]: read /var/lib/neutron/dhcp/5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06/opts Nov 26 05:02:48 localhost nova_compute[281415]: 2025-11-26 10:02:48.958 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:48 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:48.977 262471 INFO neutron.agent.dhcp.agent [None req-29ba5edb-ceee-4846-883e-7bce9d0a4c5c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:49 localhost ovn_controller[153664]: 2025-11-26T10:02:49Z|00342|binding|INFO|Releasing lport 10dfb5e4-9c8b-465e-ad19-d1f21f7259ab from this chassis (sb_readonly=0) Nov 26 05:02:49 
localhost nova_compute[281415]: 2025-11-26 10:02:49.180 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:49 localhost kernel: device tap10dfb5e4-9c left promiscuous mode Nov 26 05:02:49 localhost ovn_controller[153664]: 2025-11-26T10:02:49Z|00343|binding|INFO|Setting lport 10dfb5e4-9c8b-465e-ad19-d1f21f7259ab down in Southbound Nov 26 05:02:49 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:49.191 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b275d5fc6d14aa0a6fd7bf9cf5d748e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d41971c-67fe-45e3-af0f-8b1ac08af602, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=10dfb5e4-9c8b-465e-ad19-d1f21f7259ab) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:49 localhost ovn_metadata_agent[159481]: 2025-11-26 
10:02:49.193 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 10dfb5e4-9c8b-465e-ad19-d1f21f7259ab in datapath 5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06 unbound from our chassis#033[00m Nov 26 05:02:49 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:49.195 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:49 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:49.196 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f194b8df-3ac3-43b1-a43a-26165b3f07f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:49 localhost nova_compute[281415]: 2025-11-26 10:02:49.207 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:49.434 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:02:49 localhost systemd[1]: tmp-crun.1Z1e8b.mount: Deactivated successfully. Nov 26 05:02:49 localhost systemd[1]: var-lib-containers-storage-overlay-38781ffbcdadc1e0b4672942671cd276e62efdc06bb8991accdbac5831cafd76-merged.mount: Deactivated successfully. Nov 26 05:02:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5258d57d5b796013b048396563c56ec41a779aa0867ab3d12dbe514d936dbde8-userdata-shm.mount: Deactivated successfully. Nov 26 05:02:49 localhost systemd[1]: run-netns-qdhcp\x2d95c06944\x2dbfb3\x2d4728\x2d916a\x2dc0571cdd5f7f.mount: Deactivated successfully. 
Nov 26 05:02:49 localhost ovn_controller[153664]: 2025-11-26T10:02:49Z|00344|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:02:49 localhost nova_compute[281415]: 2025-11-26 10:02:49.788 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:50 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:50.390 2 INFO neutron.agent.securitygroups_rpc [None req-7058c171-bb7f-44db-af3f-a78feebd22b1 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:50 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e143 e143: 6 total, 6 up, 6 in Nov 26 05:02:51 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:51.511 2 INFO neutron.agent.securitygroups_rpc [None req-d877f576-d573-4928-a014-0e90eb4bc0c0 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:51 localhost nova_compute[281415]: 2025-11-26 10:02:51.544 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:52 localhost nova_compute[281415]: 2025-11-26 10:02:52.783 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:53 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:53.005 2 INFO neutron.agent.securitygroups_rpc [None req-69150c1c-1673-4a92-8808-26d9c96d587e bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:53 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e144 e144: 6 
total, 6 up, 6 in Nov 26 05:02:53 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:02:53 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3385841575' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:02:53 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:02:53 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3385841575' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:02:53 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:53.887 2 INFO neutron.agent.securitygroups_rpc [None req-bbdbe133-46bb-4ec8-b9e7-0c014fbd46fb 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['9b3021a0-0d1a-427c-b6d8-d2c2ad8ab561']#033[00m Nov 26 05:02:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:54 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:54.199 262471 INFO neutron.agent.linux.ip_lib [None req-61550d8a-34b7-4a8d-a7eb-c8b0f07c961c - - - - - -] Device tap6db7be84-38 cannot be used as it has no MAC address#033[00m Nov 26 05:02:54 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:54.208 2 INFO neutron.agent.securitygroups_rpc [None req-0dbf26b2-f979-4a73-ab85-f4073e07dde5 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['9b3021a0-0d1a-427c-b6d8-d2c2ad8ab561']#033[00m Nov 26 05:02:54 localhost nova_compute[281415]: 2025-11-26 10:02:54.227 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:54 localhost kernel: device tap6db7be84-38 entered promiscuous mode Nov 26 05:02:54 localhost nova_compute[281415]: 2025-11-26 10:02:54.236 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:54 localhost NetworkManager[5970]: [1764151374.2372] manager: (tap6db7be84-38): new Generic device (/org/freedesktop/NetworkManager/Devices/56) Nov 26 05:02:54 localhost ovn_controller[153664]: 2025-11-26T10:02:54Z|00345|binding|INFO|Claiming lport 6db7be84-38d4-4396-9a18-472c4b4386d9 for this chassis. Nov 26 05:02:54 localhost ovn_controller[153664]: 2025-11-26T10:02:54Z|00346|binding|INFO|6db7be84-38d4-4396-9a18-472c4b4386d9: Claiming unknown Nov 26 05:02:54 localhost systemd-udevd[320555]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:02:54 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:54.254 2 INFO neutron.agent.securitygroups_rpc [None req-74f42de2-efc5-4b33-8124-9f2aa6f351ab bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:54 localhost ovn_controller[153664]: 2025-11-26T10:02:54Z|00347|binding|INFO|Setting lport 6db7be84-38d4-4396-9a18-472c4b4386d9 ovn-installed in OVS Nov 26 05:02:54 localhost journal[229445]: ethtool ioctl error on tap6db7be84-38: No such device Nov 26 05:02:54 localhost ovn_controller[153664]: 2025-11-26T10:02:54Z|00348|binding|INFO|Setting lport 6db7be84-38d4-4396-9a18-472c4b4386d9 up in Southbound Nov 26 05:02:54 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:54.276 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, 
nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-2bd5a088-20bc-44fc-b6b5-c918d5d98faf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bd5a088-20bc-44fc-b6b5-c918d5d98faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4b4d6e653de42458dbb1d0be0428a0e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d7f90ab-f6f9-4557-88bb-1180ecbebc81, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6db7be84-38d4-4396-9a18-472c4b4386d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:02:54 localhost nova_compute[281415]: 2025-11-26 10:02:54.277 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:54 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:54.279 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 6db7be84-38d4-4396-9a18-472c4b4386d9 in datapath 2bd5a088-20bc-44fc-b6b5-c918d5d98faf bound to our chassis#033[00m Nov 26 05:02:54 localhost journal[229445]: ethtool ioctl error on tap6db7be84-38: No such device Nov 26 05:02:54 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:54.281 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2640658b-4b65-4a49-a658-1bcb13c8f6d2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:02:54 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:54.281 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2bd5a088-20bc-44fc-b6b5-c918d5d98faf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:02:54 localhost ovn_metadata_agent[159481]: 2025-11-26 10:02:54.283 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[fdbbbd31-7137-4278-b9f3-409d4eecd081]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:02:54 localhost journal[229445]: ethtool ioctl error on tap6db7be84-38: No such device Nov 26 05:02:54 localhost journal[229445]: ethtool ioctl error on tap6db7be84-38: No such device Nov 26 05:02:54 localhost journal[229445]: ethtool ioctl error on tap6db7be84-38: No such device Nov 26 05:02:54 localhost journal[229445]: ethtool ioctl error on tap6db7be84-38: No such device Nov 26 05:02:54 localhost journal[229445]: ethtool ioctl error on tap6db7be84-38: No such device Nov 26 05:02:54 localhost journal[229445]: ethtool ioctl error on tap6db7be84-38: No such device Nov 26 05:02:54 localhost nova_compute[281415]: 2025-11-26 10:02:54.324 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:54 localhost nova_compute[281415]: 2025-11-26 10:02:54.361 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:55 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:55.155 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, 
binding:vnic_type=normal, created_at=2025-11-26T10:02:54Z, description=, device_id=11fc9dbd-19ef-4c05-9396-91e6e2765ccc, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=13466cbe-8cef-45a9-a951-b534c7ea0734, ip_allocation=immediate, mac_address=fa:16:3e:07:cd:39, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:35Z, description=, dns_domain=, id=28eaeab7-5bbd-4432-b01d-9418570f7b99, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-712405859, port_security_enabled=True, project_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46481, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1887, status=ACTIVE, subnets=['d240e944-1bf4-4e9e-a7dc-1561c3afcf0b'], tags=[], tenant_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, updated_at=2025-11-26T10:02:36Z, vlan_transparent=None, network_id=28eaeab7-5bbd-4432-b01d-9418570f7b99, port_security_enabled=False, project_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2008, status=DOWN, tags=[], tenant_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, updated_at=2025-11-26T10:02:54Z on network 28eaeab7-5bbd-4432-b01d-9418570f7b99#033[00m Nov 26 05:02:55 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:55.279 2 INFO neutron.agent.securitygroups_rpc [None req-6a7e2403-ba55-4715-85af-d8bed3e5d59b bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:55 localhost podman[320626]: Nov 26 05:02:55 localhost podman[320626]: 2025-11-26 10:02:55.336132733 +0000 UTC m=+0.112156678 container create 
0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:02:55 localhost podman[320626]: 2025-11-26 10:02:55.282287845 +0000 UTC m=+0.058311820 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:02:55 localhost systemd[1]: Started libpod-conmon-0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d.scope. Nov 26 05:02:55 localhost systemd[1]: tmp-crun.1o5aSE.mount: Deactivated successfully. Nov 26 05:02:55 localhost systemd[1]: Started libcrun container. 
Nov 26 05:02:55 localhost dnsmasq[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/addn_hosts - 1 addresses Nov 26 05:02:55 localhost dnsmasq-dhcp[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/host Nov 26 05:02:55 localhost dnsmasq-dhcp[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/opts Nov 26 05:02:55 localhost podman[320656]: 2025-11-26 10:02:55.426452688 +0000 UTC m=+0.074963142 container kill 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:02:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac4177f4008bd2cdcddf91b223dcfbad322079d4684847f69f77f51d761bdd67/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:02:55 localhost podman[320626]: 2025-11-26 10:02:55.442876481 +0000 UTC m=+0.218900426 container init 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:02:55 localhost podman[320626]: 2025-11-26 10:02:55.452677611 
+0000 UTC m=+0.228701556 container start 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 26 05:02:55 localhost dnsmasq[320676]: started, version 2.85 cachesize 150 Nov 26 05:02:55 localhost dnsmasq[320676]: DNS service limited to local subnets Nov 26 05:02:55 localhost dnsmasq[320676]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:02:55 localhost dnsmasq[320676]: warning: no upstream servers configured Nov 26 05:02:55 localhost dnsmasq-dhcp[320676]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:02:55 localhost dnsmasq[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/addn_hosts - 0 addresses Nov 26 05:02:55 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/host Nov 26 05:02:55 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/opts Nov 26 05:02:55 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:55.703 262471 INFO neutron.agent.dhcp.agent [None req-6baf9320-d461-497d-8ce1-6caa85c32fbd - - - - - -] DHCP configuration for ports {'8a75211c-5209-429f-9863-58857a805b27'} is completed#033[00m Nov 26 05:02:55 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:55.718 2 INFO neutron.agent.securitygroups_rpc [None req-1b558f29-ba07-4fcf-8be0-1da9a974def4 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 
- - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:02:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:56.046 262471 INFO neutron.agent.dhcp.agent [None req-c048b581-9770-4ce2-84c7-11f339c391b9 - - - - - -] DHCP configuration for ports {'13466cbe-8cef-45a9-a951-b534c7ea0734'} is completed#033[00m Nov 26 05:02:56 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:56.429 2 INFO neutron.agent.securitygroups_rpc [None req-5bdadc5d-fdb6-4632-b7ae-856496e2482e 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:02:56 localhost nova_compute[281415]: 2025-11-26 10:02:56.586 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:56 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:56.891 2 INFO neutron.agent.securitygroups_rpc [None req-adf075b8-82ee-4a3f-99ad-2ff383e326c8 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:02:56 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:56.969 2 INFO neutron.agent.securitygroups_rpc [None req-5f3a1202-3dbb-4f4b-8627-57d6862936ea bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:02:57 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:57.115 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:54Z, description=, device_id=11fc9dbd-19ef-4c05-9396-91e6e2765ccc, 
device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=13466cbe-8cef-45a9-a951-b534c7ea0734, ip_allocation=immediate, mac_address=fa:16:3e:07:cd:39, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:35Z, description=, dns_domain=, id=28eaeab7-5bbd-4432-b01d-9418570f7b99, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-712405859, port_security_enabled=True, project_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46481, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1887, status=ACTIVE, subnets=['d240e944-1bf4-4e9e-a7dc-1561c3afcf0b'], tags=[], tenant_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, updated_at=2025-11-26T10:02:36Z, vlan_transparent=None, network_id=28eaeab7-5bbd-4432-b01d-9418570f7b99, port_security_enabled=False, project_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2008, status=DOWN, tags=[], tenant_id=0b275d5fc6d14aa0a6fd7bf9cf5d748e, updated_at=2025-11-26T10:02:54Z on network 28eaeab7-5bbd-4432-b01d-9418570f7b99#033[00m Nov 26 05:02:57 localhost dnsmasq[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/addn_hosts - 1 addresses Nov 26 05:02:57 localhost dnsmasq-dhcp[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/host Nov 26 05:02:57 localhost podman[320701]: 2025-11-26 10:02:57.359165426 +0000 UTC m=+0.063300118 container kill 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:02:57 localhost dnsmasq-dhcp[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/opts Nov 26 05:02:57 localhost podman[240049]: time="2025-11-26T10:02:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:02:57 localhost podman[240049]: @ - - [26/Nov/2025:10:02:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159335 "" "Go-http-client/1.1" Nov 26 05:02:57 localhost podman[240049]: @ - - [26/Nov/2025:10:02:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20191 "" "Go-http-client/1.1" Nov 26 05:02:57 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:57.691 262471 INFO neutron.agent.dhcp.agent [None req-c8bb3661-3292-44c4-b1ad-2d410988818d - - - - - -] DHCP configuration for ports {'13466cbe-8cef-45a9-a951-b534c7ea0734'} is completed#033[00m Nov 26 05:02:57 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:57.737 2 INFO neutron.agent.securitygroups_rpc [None req-aaad9e65-15e7-4bdb-b919-3c9aa0c45dc8 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:02:57 localhost nova_compute[281415]: 2025-11-26 10:02:57.814 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:02:58 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:58.250 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port 
admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:02:57Z, description=, device_id=5292bbea-4901-44d7-bf2b-bda2f7edd9f0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=de6e1dc9-f105-4cc1-8702-177c68ea5bdd, ip_allocation=immediate, mac_address=fa:16:3e:fc:ea:ca, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:49Z, description=, dns_domain=, id=2bd5a088-20bc-44fc-b6b5-c918d5d98faf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-2057415237, port_security_enabled=True, project_id=b4b4d6e653de42458dbb1d0be0428a0e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1983, status=ACTIVE, subnets=['cb27faab-5cb7-4d33-997c-033cfef32c47'], tags=[], tenant_id=b4b4d6e653de42458dbb1d0be0428a0e, updated_at=2025-11-26T10:02:51Z, vlan_transparent=None, network_id=2bd5a088-20bc-44fc-b6b5-c918d5d98faf, port_security_enabled=False, project_id=b4b4d6e653de42458dbb1d0be0428a0e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2028, status=DOWN, tags=[], tenant_id=b4b4d6e653de42458dbb1d0be0428a0e, updated_at=2025-11-26T10:02:58Z on network 2bd5a088-20bc-44fc-b6b5-c918d5d98faf#033[00m Nov 26 05:02:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e145 e145: 6 total, 6 up, 6 in Nov 26 05:02:58 localhost podman[320737]: 2025-11-26 10:02:58.543142083 +0000 UTC m=+0.070846330 container kill 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:02:58 localhost dnsmasq[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/addn_hosts - 1 addresses Nov 26 05:02:58 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/host Nov 26 05:02:58 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/opts Nov 26 05:02:58 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:58.824 2 INFO neutron.agent.securitygroups_rpc [None req-297a500d-fcfd-491f-a3bf-16795e52e12c 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:02:58 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:58.868 262471 INFO neutron.agent.dhcp.agent [None req-b8043700-3ee9-4ba7-a363-a1fd88fe9624 - - - - - -] DHCP configuration for ports {'de6e1dc9-f105-4cc1-8702-177c68ea5bdd'} is completed#033[00m Nov 26 05:02:59 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:59.025 2 INFO neutron.agent.securitygroups_rpc [None req-5700db8e-a0bb-4eda-a625-2adff166c979 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:02:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:59.171 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2025-11-26T10:02:57Z, description=, device_id=5292bbea-4901-44d7-bf2b-bda2f7edd9f0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=de6e1dc9-f105-4cc1-8702-177c68ea5bdd, ip_allocation=immediate, mac_address=fa:16:3e:fc:ea:ca, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:49Z, description=, dns_domain=, id=2bd5a088-20bc-44fc-b6b5-c918d5d98faf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-2057415237, port_security_enabled=True, project_id=b4b4d6e653de42458dbb1d0be0428a0e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1983, status=ACTIVE, subnets=['cb27faab-5cb7-4d33-997c-033cfef32c47'], tags=[], tenant_id=b4b4d6e653de42458dbb1d0be0428a0e, updated_at=2025-11-26T10:02:51Z, vlan_transparent=None, network_id=2bd5a088-20bc-44fc-b6b5-c918d5d98faf, port_security_enabled=False, project_id=b4b4d6e653de42458dbb1d0be0428a0e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2028, status=DOWN, tags=[], tenant_id=b4b4d6e653de42458dbb1d0be0428a0e, updated_at=2025-11-26T10:02:58Z on network 2bd5a088-20bc-44fc-b6b5-c918d5d98faf#033[00m Nov 26 05:02:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:02:59 localhost neutron_sriov_agent[255515]: 2025-11-26 10:02:59.305 2 INFO neutron.agent.securitygroups_rpc [None req-c0bb43be-fe42-400c-9836-57e3f17ed579 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:02:59 
localhost systemd[1]: tmp-crun.I8volI.mount: Deactivated successfully. Nov 26 05:02:59 localhost dnsmasq[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/addn_hosts - 1 addresses Nov 26 05:02:59 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/host Nov 26 05:02:59 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/opts Nov 26 05:02:59 localhost podman[320812]: 2025-11-26 10:02:59.416873181 +0000 UTC m=+0.078684471 container kill 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 05:02:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:02:59.709 262471 INFO neutron.agent.dhcp.agent [None req-70288b58-a5f9-4577-a51a-0bf5b74f5536 - - - - - -] DHCP configuration for ports {'de6e1dc9-f105-4cc1-8702-177c68ea5bdd'} is completed#033[00m Nov 26 05:03:00 localhost dnsmasq[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/addn_hosts - 0 addresses Nov 26 05:03:00 localhost dnsmasq-dhcp[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/host Nov 26 05:03:00 localhost podman[320895]: 2025-11-26 10:03:00.158192784 +0000 UTC m=+0.071902562 container kill 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, tcib_managed=true, org.label-schema.build-date=20251118, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:03:00 localhost dnsmasq-dhcp[319820]: read /var/lib/neutron/dhcp/28eaeab7-5bbd-4432-b01d-9418570f7b99/opts Nov 26 05:03:00 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:00.179 2 INFO neutron.agent.securitygroups_rpc [None req-91dd1f98-7c11-467d-bb54-811097433e0e 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:03:00 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:03:00 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:03:00 localhost ovn_controller[153664]: 2025-11-26T10:03:00Z|00349|binding|INFO|Releasing lport 0e740171-68cc-4e00-a03e-23261db33f63 from this chassis (sb_readonly=0) Nov 26 05:03:00 localhost ovn_controller[153664]: 2025-11-26T10:03:00Z|00350|binding|INFO|Setting lport 0e740171-68cc-4e00-a03e-23261db33f63 down in Southbound Nov 26 05:03:00 localhost nova_compute[281415]: 2025-11-26 10:03:00.607 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:00 localhost kernel: device tap0e740171-68 left promiscuous mode Nov 26 05:03:00 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:00.621 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-28eaeab7-5bbd-4432-b01d-9418570f7b99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28eaeab7-5bbd-4432-b01d-9418570f7b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b275d5fc6d14aa0a6fd7bf9cf5d748e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=706617a3-34ce-4eb9-8077-b25188fc93f4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0e740171-68cc-4e00-a03e-23261db33f63) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:00 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:00.623 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0e740171-68cc-4e00-a03e-23261db33f63 in datapath 28eaeab7-5bbd-4432-b01d-9418570f7b99 unbound from our chassis#033[00m Nov 26 05:03:00 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:00.624 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28eaeab7-5bbd-4432-b01d-9418570f7b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:03:00 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:00.625 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[e806c03c-d4e9-4d15-8a23-a32083522358]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:00 
localhost nova_compute[281415]: 2025-11-26 10:03:00.637 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:00 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:00.885 2 INFO neutron.agent.securitygroups_rpc [None req-8a701f5a-37b8-43eb-b1bc-bcd2c2da6440 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:03:01 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:01.037 2 INFO neutron.agent.securitygroups_rpc [None req-741810af-84bd-4d7b-ac01-624d607599fa 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']#033[00m Nov 26 05:03:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:01.143 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:03:00Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=84910499-d75e-4f60-a229-ef617c0f4da3, ip_allocation=immediate, mac_address=fa:16:3e:d9:59:d2, name=tempest-FloatingIPTestJSON-1321446764, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:02:49Z, description=, dns_domain=, id=2bd5a088-20bc-44fc-b6b5-c918d5d98faf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-2057415237, port_security_enabled=True, project_id=b4b4d6e653de42458dbb1d0be0428a0e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10450, qos_policy_id=None, revision_number=2, router:external=False, 
shared=False, standard_attr_id=1983, status=ACTIVE, subnets=['cb27faab-5cb7-4d33-997c-033cfef32c47'], tags=[], tenant_id=b4b4d6e653de42458dbb1d0be0428a0e, updated_at=2025-11-26T10:02:51Z, vlan_transparent=None, network_id=2bd5a088-20bc-44fc-b6b5-c918d5d98faf, port_security_enabled=True, project_id=b4b4d6e653de42458dbb1d0be0428a0e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6132014c-a03b-42fa-9169-0b75f723efcc'], standard_attr_id=2047, status=DOWN, tags=[], tenant_id=b4b4d6e653de42458dbb1d0be0428a0e, updated_at=2025-11-26T10:03:00Z on network 2bd5a088-20bc-44fc-b6b5-c918d5d98faf#033[00m Nov 26 05:03:01 localhost dnsmasq[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/addn_hosts - 2 addresses Nov 26 05:03:01 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/host Nov 26 05:03:01 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/opts Nov 26 05:03:01 localhost podman[320939]: 2025-11-26 10:03:01.367998673 +0000 UTC m=+0.048730558 container kill 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:03:01 localhost nova_compute[281415]: 2025-11-26 10:03:01.620 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:01.672 262471 INFO 
neutron.agent.dhcp.agent [None req-ca523065-b9d9-4b42-b885-d5051d9b4242 - - - - - -] DHCP configuration for ports {'84910499-d75e-4f60-a229-ef617c0f4da3'} is completed#033[00m Nov 26 05:03:01 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:01.722 2 INFO neutron.agent.securitygroups_rpc [None req-6fd92bf1-9280-4b31-bd52-32f1b9942540 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a451f80e-a081-4949-bfca-19208d633660']#033[00m Nov 26 05:03:01 localhost nova_compute[281415]: 2025-11-26 10:03:01.728 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:01 localhost nova_compute[281415]: 2025-11-26 10:03:01.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:03:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:03:02 localhost nova_compute[281415]: 2025-11-26 10:03:02.860 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:02 localhost podman[320961]: 2025-11-26 10:03:02.911414961 +0000 UTC m=+0.159858536 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 05:03:02 localhost podman[320960]: 2025-11-26 10:03:02.879470789 +0000 UTC m=+0.130483629 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:03:02 localhost podman[320961]: 2025-11-26 10:03:02.951448002 +0000 UTC m=+0.199891577 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS) Nov 26 05:03:02 localhost podman[320960]: 2025-11-26 10:03:02.959440118 +0000 UTC m=+0.210452958 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:03:02 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:03:02 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:03:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:03.669 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:03:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:03.669 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:03:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:03.670 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:03:03 localhost nova_compute[281415]: 2025-11-26 10:03:03.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:03 localhost nova_compute[281415]: 2025-11-26 10:03:03.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:03:04 localhost nova_compute[281415]: 2025-11-26 10:03:04.846 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:04 localhost nova_compute[281415]: 2025-11-26 10:03:04.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:04 localhost nova_compute[281415]: 2025-11-26 10:03:04.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:03:05 localhost nova_compute[281415]: 2025-11-26 10:03:05.004 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:05 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:05.574 2 INFO neutron.agent.securitygroups_rpc [None req-0438bfe8-d49d-4b4e-9419-ef8a2f0af921 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['e6191f90-8dcc-404e-96e3-f464f990a451']#033[00m Nov 26 05:03:05 localhost nova_compute[281415]: 2025-11-26 10:03:05.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:05 localhost nova_compute[281415]: 2025-11-26 10:03:05.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:05 localhost nova_compute[281415]: 2025-11-26 10:03:05.876 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:03:05 localhost nova_compute[281415]: 2025-11-26 10:03:05.876 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:03:05 localhost nova_compute[281415]: 2025-11-26 10:03:05.876 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:03:05 localhost nova_compute[281415]: 2025-11-26 10:03:05.877 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:03:05 localhost nova_compute[281415]: 2025-11-26 10:03:05.877 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:03:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:03:06 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1771561668' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.420 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:03:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:03:06 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.498 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.498 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.663 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.769 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.770 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11262MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.771 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.771 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:03:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:06.775 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:06 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:06.785 2 INFO neutron.agent.securitygroups_rpc [None req-5746c39a-6df0-4e72-82b8-88c26f00da52 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.868 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 
'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.868 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.869 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.921 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:03:06 localhost nova_compute[281415]: 2025-11-26 10:03:06.944 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:07 localhost dnsmasq[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/addn_hosts - 1 addresses Nov 26 05:03:07 localhost podman[321043]: 2025-11-26 10:03:07.0790296 +0000 UTC m=+0.069593663 container kill 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:03:07 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/host Nov 26 05:03:07 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/opts Nov 26 05:03:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:03:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3893680136' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:03:07 localhost nova_compute[281415]: 2025-11-26 10:03:07.392 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:03:07 localhost nova_compute[281415]: 2025-11-26 10:03:07.399 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:03:07 localhost nova_compute[281415]: 2025-11-26 10:03:07.427 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 
512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:03:07 localhost nova_compute[281415]: 2025-11-26 10:03:07.429 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:03:07 localhost nova_compute[281415]: 2025-11-26 10:03:07.430 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:03:07 localhost podman[321099]: 2025-11-26 10:03:07.448765254 +0000 UTC m=+0.060533295 container kill 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:03:07 localhost dnsmasq[320102]: exiting on receipt of SIGTERM Nov 26 05:03:07 localhost systemd[1]: libpod-3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a.scope: Deactivated successfully. 
Nov 26 05:03:07 localhost podman[321115]: 2025-11-26 10:03:07.520177201 +0000 UTC m=+0.053293153 container died 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:03:07 localhost podman[321115]: 2025-11-26 10:03:07.558449009 +0000 UTC m=+0.091564951 container cleanup 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 05:03:07 localhost systemd[1]: libpod-conmon-3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a.scope: Deactivated successfully. 
Nov 26 05:03:07 localhost podman[321116]: 2025-11-26 10:03:07.608488505 +0000 UTC m=+0.135505377 container remove 3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d21dfd3-676b-45ed-9c36-2d6f0bfc9f06, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 26 05:03:07 localhost nova_compute[281415]: 2025-11-26 10:03:07.893 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:03:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:03:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:08.027 262471 INFO neutron.agent.dhcp.agent [None req-a1d2bfaa-9222-4429-95ef-6ebb5f273fac - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:08 localhost systemd[1]: tmp-crun.A4svxt.mount: Deactivated successfully. Nov 26 05:03:08 localhost systemd[1]: var-lib-containers-storage-overlay-b4522744029507b27dcb165f126932b25c4a56eb81c6fcf579d2fdea5eb41895-merged.mount: Deactivated successfully. Nov 26 05:03:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3567a1af4b5fd93772a46c31d82ad49712cff10ed133b2f4646b1f84b060f91a-userdata-shm.mount: Deactivated successfully. 
Nov 26 05:03:08 localhost systemd[1]: run-netns-qdhcp\x2d5d21dfd3\x2d676b\x2d45ed\x2d9c36\x2d2d6f0bfc9f06.mount: Deactivated successfully. Nov 26 05:03:08 localhost systemd[1]: tmp-crun.WUhY1Q.mount: Deactivated successfully. Nov 26 05:03:08 localhost podman[321143]: 2025-11-26 10:03:08.143828102 +0000 UTC m=+0.147006266 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible) Nov 26 05:03:08 localhost podman[321142]: 2025-11-26 10:03:08.106214594 +0000 UTC m=+0.113136659 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0) Nov 26 05:03:08 localhost podman[321142]: 2025-11-26 10:03:08.187251024 +0000 UTC m=+0.194173069 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:03:08 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:03:08 localhost podman[321143]: 2025-11-26 10:03:08.211269921 +0000 UTC m=+0.214448135 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 26 05:03:08 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:03:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:08.344 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:82:ef 10.100.0.19 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-5cc21cab-ff4d-43fe-9833-0f3737558817', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5cc21cab-ff4d-43fe-9833-0f3737558817', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a8f071cf-61bd-40f5-a1c7-404348405c9d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=58453e1a-61e8-4d39-93a5-17a2cc5604c1) old=Port_Binding(mac=['fa:16:3e:b9:82:ef 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-5cc21cab-ff4d-43fe-9833-0f3737558817', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5cc21cab-ff4d-43fe-9833-0f3737558817', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:08.346 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 58453e1a-61e8-4d39-93a5-17a2cc5604c1 in datapath 5cc21cab-ff4d-43fe-9833-0f3737558817 updated#033[00m Nov 26 05:03:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:08.349 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5cc21cab-ff4d-43fe-9833-0f3737558817, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:03:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:08.350 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a70fd853-6bc6-460b-b16a-c3ae328f0fc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:08 localhost nova_compute[281415]: 2025-11-26 10:03:08.743 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:08.744 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:08.746 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis 
table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:03:08 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:08.747 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:03:09 localhost systemd[1]: tmp-crun.sb1czD.mount: Deactivated successfully. Nov 26 05:03:09 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:09.094 2 INFO neutron.agent.securitygroups_rpc [None req-1f4000a7-256a-4f94-840f-c1453dbefc5c 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a90cb5e4-b84b-44a9-a77a-6cdb757191dc']#033[00m Nov 26 05:03:09 localhost podman[321205]: 2025-11-26 10:03:09.128833792 +0000 UTC m=+0.067308796 container kill 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 26 05:03:09 localhost dnsmasq[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/addn_hosts - 0 addresses Nov 26 05:03:09 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/host Nov 26 05:03:09 localhost dnsmasq-dhcp[320676]: read /var/lib/neutron/dhcp/2bd5a088-20bc-44fc-b6b5-c918d5d98faf/opts Nov 26 
05:03:09 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:09.146 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:09 localhost ovn_controller[153664]: 2025-11-26T10:03:09Z|00351|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:03:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.221 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.431 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.431 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.432 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:03:09 localhost ovn_controller[153664]: 2025-11-26T10:03:09Z|00352|binding|INFO|Releasing lport 6db7be84-38d4-4396-9a18-472c4b4386d9 from this chassis (sb_readonly=0) Nov 26 05:03:09 localhost kernel: device tap6db7be84-38 left promiscuous mode Nov 26 05:03:09 localhost ovn_controller[153664]: 
2025-11-26T10:03:09Z|00353|binding|INFO|Setting lport 6db7be84-38d4-4396-9a18-472c4b4386d9 down in Southbound Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.455 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.471 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:09.470 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-2bd5a088-20bc-44fc-b6b5-c918d5d98faf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2bd5a088-20bc-44fc-b6b5-c918d5d98faf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4b4d6e653de42458dbb1d0be0428a0e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d7f90ab-f6f9-4557-88bb-1180ecbebc81, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6db7be84-38d4-4396-9a18-472c4b4386d9) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:09.473 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 6db7be84-38d4-4396-9a18-472c4b4386d9 in datapath 2bd5a088-20bc-44fc-b6b5-c918d5d98faf unbound from our chassis#033[00m Nov 26 05:03:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:09.476 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2bd5a088-20bc-44fc-b6b5-c918d5d98faf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:03:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:09.477 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1a28e732-1292-41d6-895f-7364e9aa9ed6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:09 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:09.519 2 INFO neutron.agent.securitygroups_rpc [None req-2585e993-5ddd-46aa-840f-f956db4df03c 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['a90cb5e4-b84b-44a9-a77a-6cdb757191dc']#033[00m Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.540 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.540 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.541 281419 DEBUG 
nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:03:09 localhost nova_compute[281415]: 2025-11-26 10:03:09.541 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:03:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:03:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:03:10 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:10.057 2 INFO neutron.agent.securitygroups_rpc [None req-165555de-40a2-4387-9c62-98fbafd4c002 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:03:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:03:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4179144955' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:03:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:03:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/4179144955' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:03:10 localhost systemd[1]: tmp-crun.K5lKjQ.mount: Deactivated successfully. Nov 26 05:03:10 localhost dnsmasq[319820]: exiting on receipt of SIGTERM Nov 26 05:03:10 localhost podman[321246]: 2025-11-26 10:03:10.872451754 +0000 UTC m=+0.071780857 container kill 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:03:10 localhost systemd[1]: libpod-330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02.scope: Deactivated successfully. 
Nov 26 05:03:10 localhost podman[321262]: 2025-11-26 10:03:10.945099227 +0000 UTC m=+0.055622852 container died 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:03:10 localhost podman[321262]: 2025-11-26 10:03:10.975509674 +0000 UTC m=+0.086033259 container cleanup 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 26 05:03:10 localhost systemd[1]: libpod-conmon-330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02.scope: Deactivated successfully. 
Nov 26 05:03:11 localhost podman[321263]: 2025-11-26 10:03:11.038178931 +0000 UTC m=+0.137557917 container remove 330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-28eaeab7-5bbd-4432-b01d-9418570f7b99, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:03:11 localhost ovn_controller[153664]: 2025-11-26T10:03:11Z|00354|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:03:11 localhost nova_compute[281415]: 2025-11-26 10:03:11.083 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:11 localhost nova_compute[281415]: 2025-11-26 10:03:11.110 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:03:11 localhost nova_compute[281415]: 2025-11-26 10:03:11.136 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:03:11 localhost nova_compute[281415]: 2025-11-26 10:03:11.136 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:03:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:11.253 262471 INFO neutron.agent.dhcp.agent [None req-c0def5c4-58f8-4937-8711-2fcd1bbd967e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:11.254 262471 INFO neutron.agent.dhcp.agent [None req-c0def5c4-58f8-4937-8711-2fcd1bbd967e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:11 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:11.435 2 INFO neutron.agent.securitygroups_rpc [None req-2c8d9faf-d3d9-4b32-a145-031b0f6cf4db bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated 
['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:03:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:11.532 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:11 localhost nova_compute[281415]: 2025-11-26 10:03:11.699 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:11 localhost systemd[1]: var-lib-containers-storage-overlay-11d9cd093715b5b5a080bd9cb65aaf37c42471c562d6967139b417e0b72c52af-merged.mount: Deactivated successfully. Nov 26 05:03:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-330f327f462c84c2b7114e07c5805bf40489aa164b28888bcf85455e2c902a02-userdata-shm.mount: Deactivated successfully. Nov 26 05:03:11 localhost systemd[1]: run-netns-qdhcp\x2d28eaeab7\x2d5bbd\x2d4432\x2db01d\x2d9418570f7b99.mount: Deactivated successfully. Nov 26 05:03:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:03:11 localhost systemd[1]: tmp-crun.IeBySM.mount: Deactivated successfully. 
Nov 26 05:03:11 localhost podman[321291]: 2025-11-26 10:03:11.981306256 +0000 UTC m=+0.093886200 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:03:11 localhost podman[321291]: 2025-11-26 10:03:11.994520156 +0000 UTC m=+0.107100110 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:03:12 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:03:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:12.153 2 INFO neutron.agent.securitygroups_rpc [None req-c5443192-397a-4a2d-935a-41001df1d754 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['73faac63-ee0b-4d1d-a772-54a64ccb923c']#033[00m Nov 26 05:03:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:12.665 2 INFO neutron.agent.securitygroups_rpc [None req-825865b2-3655-480d-8a17-e6c55be89f7e 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['73faac63-ee0b-4d1d-a772-54a64ccb923c']#033[00m Nov 26 05:03:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:12.699 2 INFO neutron.agent.securitygroups_rpc [None req-0acd79d8-16bf-40b9-a311-46be7fe67263 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:03:12 localhost dnsmasq[320676]: exiting on receipt of SIGTERM Nov 26 05:03:12 localhost podman[321328]: 2025-11-26 10:03:12.884282735 +0000 UTC m=+0.072112877 container kill 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:03:12 localhost systemd[1]: libpod-0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d.scope: Deactivated successfully. 
Nov 26 05:03:12 localhost nova_compute[281415]: 2025-11-26 10:03:12.951 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:12 localhost podman[321343]: 2025-11-26 10:03:12.98379357 +0000 UTC m=+0.082910326 container died 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:03:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d-userdata-shm.mount: Deactivated successfully. Nov 26 05:03:13 localhost podman[321343]: 2025-11-26 10:03:13.05125644 +0000 UTC m=+0.150373156 container cleanup 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:03:13 localhost systemd[1]: libpod-conmon-0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d.scope: Deactivated successfully. 
Nov 26 05:03:13 localhost podman[321344]: 2025-11-26 10:03:13.099476302 +0000 UTC m=+0.189580602 container remove 0962e2ccaea9b1323ecd4f9df03ccab9f6459ff91ac66260ff0583d7baa1e82d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2bd5a088-20bc-44fc-b6b5-c918d5d98faf, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:03:13 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:13.142 262471 INFO neutron.agent.dhcp.agent [None req-adb59125-3552-4a9f-9cf5-547e6bbd3221 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:13 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:13.325 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:13 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:13.614 2 INFO neutron.agent.securitygroups_rpc [None req-1cf8224d-14bf-43c5-9d3c-a092becd385c bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:03:13 localhost systemd[1]: var-lib-containers-storage-overlay-ac4177f4008bd2cdcddf91b223dcfbad322079d4684847f69f77f51d761bdd67-merged.mount: Deactivated successfully. Nov 26 05:03:13 localhost systemd[1]: run-netns-qdhcp\x2d2bd5a088\x2d20bc\x2d44fc\x2db6b5\x2dc918d5d98faf.mount: Deactivated successfully. Nov 26 05:03:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:03:14 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2205873065' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:03:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:03:14 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2205873065' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:03:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:03:14 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3413274865' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:03:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:03:14 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3413274865' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:03:14 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:14.444 2 INFO neutron.agent.securitygroups_rpc [None req-e1926508-5547-47ed-ae6d-8164cd0293df 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['986c96fe-5c11-4884-9fa3-905dd7c55cfb']#033[00m Nov 26 05:03:14 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:14.912 2 INFO neutron.agent.securitygroups_rpc [None req-5237d22b-a676-47bf-bea0-785e4120014e 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['986c96fe-5c11-4884-9fa3-905dd7c55cfb']#033[00m Nov 26 05:03:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:15.354 2 INFO neutron.agent.securitygroups_rpc [None req-ac9d470b-123a-4566-b362-5d0784871ac8 6346fafdd1c14dd59d4ef383e1aab144 21d5728042164c6e94c0d557d2e063de - - default default] Security group member updated ['600a9e3a-8849-44e9-95f5-e052bb3d3297']#033[00m Nov 26 05:03:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:15.513 2 INFO neutron.agent.securitygroups_rpc [None req-828bc1f5-d80d-4f1d-840a-4f25a5a1ef60 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']#033[00m Nov 26 05:03:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:03:15 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1055590572' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:03:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:03:15 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1055590572' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:03:15 localhost openstack_network_exporter[242153]: ERROR 10:03:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:03:15 localhost openstack_network_exporter[242153]: ERROR 10:03:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:03:15 localhost openstack_network_exporter[242153]: ERROR 10:03:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:03:15 localhost openstack_network_exporter[242153]: ERROR 10:03:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:03:15 localhost openstack_network_exporter[242153]: Nov 26 05:03:15 localhost openstack_network_exporter[242153]: ERROR 10:03:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:03:15 localhost openstack_network_exporter[242153]: Nov 26 05:03:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:15.859 2 INFO neutron.agent.securitygroups_rpc [None req-ac9d470b-123a-4566-b362-5d0784871ac8 6346fafdd1c14dd59d4ef383e1aab144 21d5728042164c6e94c0d557d2e063de - - default default] Security group member updated ['600a9e3a-8849-44e9-95f5-e052bb3d3297']#033[00m Nov 26 05:03:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:15.896 2 INFO neutron.agent.securitygroups_rpc [None req-c867bb4b-7780-4fd5-9968-863460bddf98 02d34db97a834b479b13de621b958dc9 
d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['986c96fe-5c11-4884-9fa3-905dd7c55cfb']#033[00m Nov 26 05:03:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:16.437 2 INFO neutron.agent.securitygroups_rpc [None req-e5fc8772-eff4-4fb5-88aa-309a5598e49c 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['986c96fe-5c11-4884-9fa3-905dd7c55cfb']#033[00m Nov 26 05:03:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:16.492 2 INFO neutron.agent.securitygroups_rpc [None req-8c644a6f-d103-4a7c-9a15-e0389596c68f 6e007ecff2d54f3d96b4ca4d0583f705 b4b4d6e653de42458dbb1d0be0428a0e - - default default] Security group member updated ['6132014c-a03b-42fa-9169-0b75f723efcc']#033[00m Nov 26 05:03:16 localhost sshd[321371]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:03:16 localhost nova_compute[281415]: 2025-11-26 10:03:16.732 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:16.743 2 INFO neutron.agent.securitygroups_rpc [None req-3d76572f-c38a-4d8d-ae4c-22c58297ed57 6346fafdd1c14dd59d4ef383e1aab144 21d5728042164c6e94c0d557d2e063de - - default default] Security group member updated ['600a9e3a-8849-44e9-95f5-e052bb3d3297']#033[00m Nov 26 05:03:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:16.835 2 INFO neutron.agent.securitygroups_rpc [None req-da6ceb02-53bb-423e-a43d-4c3af607bcef 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['986c96fe-5c11-4884-9fa3-905dd7c55cfb']#033[00m Nov 26 05:03:17 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:17.060 2 INFO neutron.agent.securitygroups_rpc [None req-dd0de9bd-5341-4e2b-b4c1-5cabc8ca94d9 6346fafdd1c14dd59d4ef383e1aab144 21d5728042164c6e94c0d557d2e063de - - default 
default] Security group member updated ['600a9e3a-8849-44e9-95f5-e052bb3d3297']#033[00m Nov 26 05:03:17 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:17.325 2 INFO neutron.agent.securitygroups_rpc [None req-fb021b7d-4bc7-48a0-abe6-ca0916500cb2 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['986c96fe-5c11-4884-9fa3-905dd7c55cfb']#033[00m Nov 26 05:03:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:03:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:03:17 localhost podman[321373]: 2025-11-26 10:03:17.831050605 +0000 UTC m=+0.089600183 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:03:17 localhost podman[321373]: 2025-11-26 10:03:17.841074251 +0000 UTC m=+0.099623839 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent) Nov 26 05:03:17 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:03:17 localhost podman[321374]: 2025-11-26 10:03:17.934070754 +0000 UTC m=+0.191117627 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251118) Nov 26 05:03:17 localhost podman[321374]: 2025-11-26 10:03:17.975457735 +0000 UTC m=+0.232504628 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd) Nov 26 05:03:17 localhost nova_compute[281415]: 2025-11-26 10:03:17.979 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:17 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:03:18 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:18.712 2 INFO neutron.agent.securitygroups_rpc [None req-97ed4313-1d33-487c-b828-31b3bc28477b 02d34db97a834b479b13de621b958dc9 d0172107fdd34e19bdc10052ccfc7415 - - default default] Security group rule updated ['17e5fb85-dae7-4a99-aafe-ff3dcc1e307c']#033[00m Nov 26 05:03:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:19.597 262471 INFO neutron.agent.linux.ip_lib [None req-96275285-f1b1-44c3-a997-8e5734488341 - - - - - -] Device tap59d37b98-2c cannot be used as it has no MAC address#033[00m Nov 26 05:03:19 localhost nova_compute[281415]: 2025-11-26 10:03:19.623 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:19 localhost kernel: device tap59d37b98-2c entered promiscuous mode Nov 26 05:03:19 localhost NetworkManager[5970]: [1764151399.6312] manager: (tap59d37b98-2c): new Generic device 
(/org/freedesktop/NetworkManager/Devices/57) Nov 26 05:03:19 localhost ovn_controller[153664]: 2025-11-26T10:03:19Z|00355|binding|INFO|Claiming lport 59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce for this chassis. Nov 26 05:03:19 localhost ovn_controller[153664]: 2025-11-26T10:03:19Z|00356|binding|INFO|59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce: Claiming unknown Nov 26 05:03:19 localhost nova_compute[281415]: 2025-11-26 10:03:19.633 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:19 localhost systemd-udevd[321423]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:03:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:19.644 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-6d4b949a-546a-427b-b5e5-05b50124c034', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d4b949a-546a-427b-b5e5-05b50124c034', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8db1e31d-20be-4649-8280-330f2d6c81ff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce) 
old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:19.646 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce in datapath 6d4b949a-546a-427b-b5e5-05b50124c034 bound to our chassis#033[00m Nov 26 05:03:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:19.648 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6d4b949a-546a-427b-b5e5-05b50124c034 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:03:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:19.649 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[e31864ac-e6cb-43da-9199-9b3c86e7ba41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:19 localhost journal[229445]: ethtool ioctl error on tap59d37b98-2c: No such device Nov 26 05:03:19 localhost ovn_controller[153664]: 2025-11-26T10:03:19Z|00357|binding|INFO|Setting lport 59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce ovn-installed in OVS Nov 26 05:03:19 localhost ovn_controller[153664]: 2025-11-26T10:03:19Z|00358|binding|INFO|Setting lport 59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce up in Southbound Nov 26 05:03:19 localhost journal[229445]: ethtool ioctl error on tap59d37b98-2c: No such device Nov 26 05:03:19 localhost nova_compute[281415]: 2025-11-26 10:03:19.679 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:19 localhost journal[229445]: ethtool ioctl error on tap59d37b98-2c: No such device Nov 26 05:03:19 localhost journal[229445]: ethtool ioctl error on tap59d37b98-2c: No such device Nov 26 05:03:19 localhost journal[229445]: ethtool ioctl error on 
tap59d37b98-2c: No such device Nov 26 05:03:19 localhost journal[229445]: ethtool ioctl error on tap59d37b98-2c: No such device Nov 26 05:03:19 localhost journal[229445]: ethtool ioctl error on tap59d37b98-2c: No such device Nov 26 05:03:19 localhost journal[229445]: ethtool ioctl error on tap59d37b98-2c: No such device Nov 26 05:03:19 localhost nova_compute[281415]: 2025-11-26 10:03:19.731 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:19 localhost nova_compute[281415]: 2025-11-26 10:03:19.768 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:20 localhost podman[321494]: Nov 26 05:03:20 localhost podman[321494]: 2025-11-26 10:03:20.730522856 +0000 UTC m=+0.109674155 container create a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:03:20 localhost podman[321494]: 2025-11-26 10:03:20.676008129 +0000 UTC m=+0.055159488 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:03:20 localhost systemd[1]: Started libpod-conmon-a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698.scope. Nov 26 05:03:20 localhost systemd[1]: Started libcrun container. 
Nov 26 05:03:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b09ae2b071a5eb3f4a20bff21b46eda828b40ceb1ef9455f4d68614e1bbbea9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:03:20 localhost podman[321494]: 2025-11-26 10:03:20.815244265 +0000 UTC m=+0.194395564 container init a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 26 05:03:20 localhost podman[321494]: 2025-11-26 10:03:20.825003193 +0000 UTC m=+0.204154492 container start a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 26 05:03:20 localhost dnsmasq[321513]: started, version 2.85 cachesize 150
Nov 26 05:03:20 localhost dnsmasq[321513]: DNS service limited to local subnets
Nov 26 05:03:20 localhost dnsmasq[321513]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:03:20 localhost dnsmasq[321513]: warning: no upstream servers configured
Nov 26 05:03:20 localhost dnsmasq-dhcp[321513]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 26 05:03:20 localhost dnsmasq[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/addn_hosts - 0 addresses
Nov 26 05:03:20 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/host
Nov 26 05:03:20 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/opts
Nov 26 05:03:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:20.977 262471 INFO neutron.agent.dhcp.agent [None req-cca4602b-2196-4fec-9b1c-2fc74c9419c0 - - - - - -] DHCP configuration for ports {'50cddd7c-8335-48ad-aa02-c35aff86882f'} is completed#033[00m
Nov 26 05:03:21 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:21.048 2 INFO neutron.agent.securitygroups_rpc [None req-ea1326ca-c1c8-485c-b3fb-a3121d3880d6 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m
Nov 26 05:03:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:21.278 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:03:20Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=fb952edf-6cfb-4f3e-ab32-660360a89057, ip_allocation=immediate, mac_address=fa:16:3e:68:81:51, name=tempest-PortsTestJSON-153174429, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:03:16Z, description=, dns_domain=, id=6d4b949a-546a-427b-b5e5-05b50124c034, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1209040959, port_security_enabled=True, project_id=c9a6d35bfc5f440e9fdc4ed36d883eff, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56930, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2097, status=ACTIVE, subnets=['bbaf0241-9870-46c9-80d4-4e80d164a12a'], tags=[], tenant_id=c9a6d35bfc5f440e9fdc4ed36d883eff, updated_at=2025-11-26T10:03:18Z, vlan_transparent=None, network_id=6d4b949a-546a-427b-b5e5-05b50124c034, port_security_enabled=True, project_id=c9a6d35bfc5f440e9fdc4ed36d883eff, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['512f55ba-befd-448e-8449-d75d9733402e'], standard_attr_id=2132, status=DOWN, tags=[], tenant_id=c9a6d35bfc5f440e9fdc4ed36d883eff, updated_at=2025-11-26T10:03:20Z on network 6d4b949a-546a-427b-b5e5-05b50124c034#033[00m
Nov 26 05:03:21 localhost dnsmasq[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/addn_hosts - 1 addresses
Nov 26 05:03:21 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/host
Nov 26 05:03:21 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/opts
Nov 26 05:03:21 localhost podman[321532]: 2025-11-26 10:03:21.496049393 +0000 UTC m=+0.047677727 container kill a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 26 05:03:21 localhost nova_compute[281415]: 2025-11-26 10:03:21.772 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:21.858 262471 INFO neutron.agent.dhcp.agent [None req-deede98a-4082-4334-aa90-0dbf5e1f8d03 - - - - - -] DHCP configuration for ports {'fb952edf-6cfb-4f3e-ab32-660360a89057'} is completed#033[00m
Nov 26 05:03:22 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:22.808 2 INFO neutron.agent.securitygroups_rpc [None req-ed2746f6-bab7-4f76-9446-1770ba082df9 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m
Nov 26 05:03:22 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:22.853 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:03:22Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=909461d5-800a-4160-9981-0c7eb6edb371, ip_allocation=immediate, mac_address=fa:16:3e:2b:f9:ea, name=tempest-PortsTestJSON-1583632910, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:03:16Z, description=, dns_domain=, id=6d4b949a-546a-427b-b5e5-05b50124c034, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1209040959, port_security_enabled=True, project_id=c9a6d35bfc5f440e9fdc4ed36d883eff, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56930, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2097, status=ACTIVE, subnets=['bbaf0241-9870-46c9-80d4-4e80d164a12a'], tags=[], tenant_id=c9a6d35bfc5f440e9fdc4ed36d883eff, updated_at=2025-11-26T10:03:18Z, vlan_transparent=None, network_id=6d4b949a-546a-427b-b5e5-05b50124c034, port_security_enabled=True, project_id=c9a6d35bfc5f440e9fdc4ed36d883eff, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['512f55ba-befd-448e-8449-d75d9733402e'], standard_attr_id=2133, status=DOWN, tags=[], tenant_id=c9a6d35bfc5f440e9fdc4ed36d883eff, updated_at=2025-11-26T10:03:22Z on network 6d4b949a-546a-427b-b5e5-05b50124c034#033[00m
Nov 26 05:03:23 localhost nova_compute[281415]: 2025-11-26 10:03:23.018 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:23 localhost dnsmasq[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/addn_hosts - 2 addresses
Nov 26 05:03:23 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/host
Nov 26 05:03:23 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/opts
Nov 26 05:03:23 localhost podman[321570]: 2025-11-26 10:03:23.156675117 +0000 UTC m=+0.071844609 container kill a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 26 05:03:23 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:23.421 262471 INFO neutron.agent.dhcp.agent [None req-b313f2bd-c137-4684-8e47-df05d5336d21 - - - - - -] DHCP configuration for ports {'909461d5-800a-4160-9981-0c7eb6edb371'} is completed#033[00m
Nov 26 05:03:23 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:23.539 2 INFO neutron.agent.securitygroups_rpc [None req-dfc3aed9-ecf4-4467-bc4e-b9dce67f3314 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m
Nov 26 05:03:23 localhost podman[321607]: 2025-11-26 10:03:23.790303235 +0000 UTC m=+0.080114184 container kill a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 26 05:03:23 localhost dnsmasq[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/addn_hosts - 1 addresses
Nov 26 05:03:23 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/host
Nov 26 05:03:23 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/opts
Nov 26 05:03:23 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:23.854 2 INFO neutron.agent.securitygroups_rpc [None req-9c57f73f-7a88-4ac2-b904-95f59c937856 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m
Nov 26 05:03:24 localhost podman[321644]: 2025-11-26 10:03:24.177966217 +0000 UTC m=+0.062607717 container kill a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 26 05:03:24 localhost dnsmasq[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/addn_hosts - 0 addresses
Nov 26 05:03:24 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/host
Nov 26 05:03:24 localhost dnsmasq-dhcp[321513]: read /var/lib/neutron/dhcp/6d4b949a-546a-427b-b5e5-05b50124c034/opts
Nov 26 05:03:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 26 05:03:24 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:24.465 262471 INFO neutron.agent.linux.ip_lib [None req-1943f4f3-4d80-481b-802c-99f8720f9f18 - - - - - -] Device tapb867643b-d8 cannot be used as it has no MAC address#033[00m
Nov 26 05:03:24 localhost nova_compute[281415]: 2025-11-26 10:03:24.495 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:24 localhost kernel: device tapb867643b-d8 entered promiscuous mode
Nov 26 05:03:24 localhost NetworkManager[5970]: [1764151404.5056] manager: (tapb867643b-d8): new Generic device (/org/freedesktop/NetworkManager/Devices/58)
Nov 26 05:03:24 localhost ovn_controller[153664]: 2025-11-26T10:03:24Z|00359|binding|INFO|Claiming lport b867643b-d8ae-48d4-93f1-185a4054e195 for this chassis.
Nov 26 05:03:24 localhost nova_compute[281415]: 2025-11-26 10:03:24.507 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:24 localhost ovn_controller[153664]: 2025-11-26T10:03:24Z|00360|binding|INFO|b867643b-d8ae-48d4-93f1-185a4054e195: Claiming unknown
Nov 26 05:03:24 localhost systemd-udevd[321675]: Network interface NamePolicy= disabled on kernel command line.
Nov 26 05:03:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:24.519 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d5728042164c6e94c0d557d2e063de', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6ce99e4-7c5d-4388-af14-f1ac3ac84689, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b867643b-d8ae-48d4-93f1-185a4054e195) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:03:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:24.521 159486 INFO neutron.agent.ovn.metadata.agent [-] Port b867643b-d8ae-48d4-93f1-185a4054e195 in datapath 50486772-0c5f-4d78-9cdb-b0d0aa3eb7be bound to our chassis#033[00m
Nov 26 05:03:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:24.523 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port fc0f984f-a142-4ddc-a914-08e16a8f4462 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Nov 26 05:03:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:24.524 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 26 05:03:24 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:24.525 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ab77945f-ae2c-4dff-9658-15ead0fdba12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:03:24 localhost journal[229445]: ethtool ioctl error on tapb867643b-d8: No such device
Nov 26 05:03:24 localhost journal[229445]: ethtool ioctl error on tapb867643b-d8: No such device
Nov 26 05:03:24 localhost ovn_controller[153664]: 2025-11-26T10:03:24Z|00361|binding|INFO|Setting lport b867643b-d8ae-48d4-93f1-185a4054e195 ovn-installed in OVS
Nov 26 05:03:24 localhost ovn_controller[153664]: 2025-11-26T10:03:24Z|00362|binding|INFO|Setting lport b867643b-d8ae-48d4-93f1-185a4054e195 up in Southbound
Nov 26 05:03:24 localhost nova_compute[281415]: 2025-11-26 10:03:24.556 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:24 localhost journal[229445]: ethtool ioctl error on tapb867643b-d8: No such device
Nov 26 05:03:24 localhost journal[229445]: ethtool ioctl error on tapb867643b-d8: No such device
Nov 26 05:03:24 localhost journal[229445]: ethtool ioctl error on tapb867643b-d8: No such device
Nov 26 05:03:24 localhost journal[229445]: ethtool ioctl error on tapb867643b-d8: No such device
Nov 26 05:03:24 localhost journal[229445]: ethtool ioctl error on tapb867643b-d8: No such device
Nov 26 05:03:24 localhost journal[229445]: ethtool ioctl error on tapb867643b-d8: No such device
Nov 26 05:03:24 localhost nova_compute[281415]: 2025-11-26 10:03:24.603 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:24 localhost nova_compute[281415]: 2025-11-26 10:03:24.635 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:25 localhost ovn_controller[153664]: 2025-11-26T10:03:25Z|00363|binding|INFO|Removing iface tap59d37b98-2c ovn-installed in OVS
Nov 26 05:03:25 localhost ovn_controller[153664]: 2025-11-26T10:03:25Z|00364|binding|INFO|Removing lport 59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce ovn-installed in OVS
Nov 26 05:03:25 localhost nova_compute[281415]: 2025-11-26 10:03:25.089 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.091 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8d902edb-fc33-4937-aeb6-43754a378279 with type ""#033[00m
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.093 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-6d4b949a-546a-427b-b5e5-05b50124c034', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d4b949a-546a-427b-b5e5-05b50124c034', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8db1e31d-20be-4649-8280-330f2d6c81ff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.095 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 59d37b98-2c6e-40c0-bcdd-9b7f2d45e2ce in datapath 6d4b949a-546a-427b-b5e5-05b50124c034 unbound from our chassis#033[00m
Nov 26 05:03:25 localhost nova_compute[281415]: 2025-11-26 10:03:25.097 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.098 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d4b949a-546a-427b-b5e5-05b50124c034, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.099 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[3914d93f-39e0-4057-a51f-56f20bbeff4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:03:25 localhost podman[321737]: 2025-11-26 10:03:25.157213476 +0000 UTC m=+0.069900001 container kill a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 26 05:03:25 localhost dnsmasq[321513]: exiting on receipt of SIGTERM
Nov 26 05:03:25 localhost systemd[1]: libpod-a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698.scope: Deactivated successfully.
Nov 26 05:03:25 localhost podman[321754]: 2025-11-26 10:03:25.237962088 +0000 UTC m=+0.057510227 container died a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 26 05:03:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698-userdata-shm.mount: Deactivated successfully.
Nov 26 05:03:25 localhost systemd[1]: var-lib-containers-storage-overlay-0b09ae2b071a5eb3f4a20bff21b46eda828b40ceb1ef9455f4d68614e1bbbea9-merged.mount: Deactivated successfully.
Nov 26 05:03:25 localhost podman[321754]: 2025-11-26 10:03:25.268029785 +0000 UTC m=+0.087577894 container cleanup a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 26 05:03:25 localhost systemd[1]: libpod-conmon-a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698.scope: Deactivated successfully.
Nov 26 05:03:25 localhost podman[321755]: 2025-11-26 10:03:25.318012809 +0000 UTC m=+0.130928182 container remove a3c60fafd80be01ab2af2b0abe895041fd483350716b75b2ec20002a48e6f698 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d4b949a-546a-427b-b5e5-05b50124c034, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 26 05:03:25 localhost kernel: device tap59d37b98-2c left promiscuous mode
Nov 26 05:03:25 localhost nova_compute[281415]: 2025-11-26 10:03:25.362 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:25 localhost nova_compute[281415]: 2025-11-26 10:03:25.377 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:25.394 262471 INFO neutron.agent.dhcp.agent [None req-3c6aad1f-7395-4fef-a3ad-4d65a3d8bfdc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:03:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:25.394 262471 INFO neutron.agent.dhcp.agent [None req-3c6aad1f-7395-4fef-a3ad-4d65a3d8bfdc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:03:25 localhost podman[321809]:
Nov 26 05:03:25 localhost podman[321809]: 2025-11-26 10:03:25.640246243 +0000 UTC m=+0.094323304 container create 59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 26 05:03:25 localhost systemd[1]: Started libpod-conmon-59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77.scope.
Nov 26 05:03:25 localhost podman[321809]: 2025-11-26 10:03:25.59575775 +0000 UTC m=+0.049834821 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 26 05:03:25 localhost systemd[1]: Started libcrun container.
Nov 26 05:03:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5b1c3d9c5f912095047e80b646130d939315b9bab891800b6a3f424f3b4aea0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 26 05:03:25 localhost podman[321809]: 2025-11-26 10:03:25.713580745 +0000 UTC m=+0.167657796 container init 59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 26 05:03:25 localhost podman[321809]: 2025-11-26 10:03:25.722293542 +0000 UTC m=+0.176370593 container start 59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 26 05:03:25 localhost dnsmasq[321828]: started, version 2.85 cachesize 150
Nov 26 05:03:25 localhost dnsmasq[321828]: DNS service limited to local subnets
Nov 26 05:03:25 localhost dnsmasq[321828]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 26 05:03:25 localhost dnsmasq[321828]: warning: no upstream servers configured
Nov 26 05:03:25 localhost dnsmasq-dhcp[321828]: DHCP, static leases only on 10.100.0.16, lease time 1d
Nov 26 05:03:25 localhost dnsmasq[321828]: read /var/lib/neutron/dhcp/50486772-0c5f-4d78-9cdb-b0d0aa3eb7be/addn_hosts - 0 addresses
Nov 26 05:03:25 localhost dnsmasq-dhcp[321828]: read /var/lib/neutron/dhcp/50486772-0c5f-4d78-9cdb-b0d0aa3eb7be/host
Nov 26 05:03:25 localhost dnsmasq-dhcp[321828]: read /var/lib/neutron/dhcp/50486772-0c5f-4d78-9cdb-b0d0aa3eb7be/opts
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.777 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fc0f984f-a142-4ddc-a914-08e16a8f4462 with type ""#033[00m
Nov 26 05:03:25 localhost ovn_controller[153664]: 2025-11-26T10:03:25Z|00365|binding|INFO|Removing iface tapb867643b-d8 ovn-installed in OVS
Nov 26 05:03:25 localhost ovn_controller[153664]: 2025-11-26T10:03:25Z|00366|binding|INFO|Removing lport b867643b-d8ae-48d4-93f1-185a4054e195 ovn-installed in OVS
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.778 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '21d5728042164c6e94c0d557d2e063de', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6ce99e4-7c5d-4388-af14-f1ac3ac84689, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b867643b-d8ae-48d4-93f1-185a4054e195) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.779 159486 INFO neutron.agent.ovn.metadata.agent [-] Port b867643b-d8ae-48d4-93f1-185a4054e195 in datapath 50486772-0c5f-4d78-9cdb-b0d0aa3eb7be unbound from our chassis#033[00m
Nov 26 05:03:25 localhost nova_compute[281415]: 2025-11-26 10:03:25.780 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.781 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 26 05:03:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:25.782 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[b66b5207-798f-4517-8d96-734f0873244c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 26 05:03:25 localhost nova_compute[281415]: 2025-11-26 10:03:25.787 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:25 localhost ovn_controller[153664]: 2025-11-26T10:03:25Z|00367|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 05:03:25 localhost sshd[321829]: main: sshd: ssh-rsa algorithm is disabled
Nov 26 05:03:25 localhost nova_compute[281415]: 2025-11-26 10:03:25.836 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 26 05:03:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:25.883 262471 INFO neutron.agent.dhcp.agent [None req-1bc43eff-60ef-4058-adc2-abbfa6039121 - - - - - -] DHCP configuration for ports {'371b414a-acd9-4932-b748-7a5ad6eb686d'} is completed#033[00m
Nov 26 05:03:26 localhost dnsmasq[321828]: exiting on receipt of SIGTERM
Nov 26 05:03:26 localhost podman[321848]: 2025-11-26 10:03:26.079651581 +0000 UTC m=+0.067996597 container kill 59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 26 05:03:26 localhost systemd[1]: libpod-59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77.scope: Deactivated successfully.
Nov 26 05:03:26 localhost podman[321860]: 2025-11-26 10:03:26.155679144 +0000 UTC m=+0.060983120 container died 59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 05:03:26 localhost podman[321860]: 2025-11-26 10:03:26.193297363 +0000 UTC m=+0.098601289 container cleanup 59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:03:26 localhost systemd[1]: libpod-conmon-59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77.scope: Deactivated successfully. 
Nov 26 05:03:26 localhost podman[321862]: 2025-11-26 10:03:26.244807332 +0000 UTC m=+0.140395371 container remove 59abd38d3cfc9c59196f7cc8902c9eb97ec40beaded6fb2411f28472aac84f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50486772-0c5f-4d78-9cdb-b0d0aa3eb7be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:03:26 localhost systemd[1]: run-netns-qdhcp\x2d6d4b949a\x2d546a\x2d427b\x2db5e5\x2d05b50124c034.mount: Deactivated successfully. Nov 26 05:03:26 localhost nova_compute[281415]: 2025-11-26 10:03:26.263 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:26 localhost kernel: device tapb867643b-d8 left promiscuous mode Nov 26 05:03:26 localhost nova_compute[281415]: 2025-11-26 10:03:26.286 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:26 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:26.298 262471 INFO neutron.agent.dhcp.agent [None req-93eb3d50-7a02-4126-9ce6-4ba1102f4b1b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:26 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:26.299 262471 INFO neutron.agent.dhcp.agent [None req-93eb3d50-7a02-4126-9ce6-4ba1102f4b1b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:26 localhost systemd[1]: run-netns-qdhcp\x2d50486772\x2d0c5f\x2d4d78\x2d9cdb\x2db0d0aa3eb7be.mount: Deactivated successfully. 
Nov 26 05:03:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:03:26 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/440754653' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:03:26 localhost nova_compute[281415]: 2025-11-26 10:03:26.810 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:27 localhost podman[240049]: time="2025-11-26T10:03:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:03:27 localhost podman[240049]: @ - - [26/Nov/2025:10:03:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:03:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e146 e146: 6 total, 6 up, 6 in Nov 26 05:03:27 localhost podman[240049]: @ - - [26/Nov/2025:10:03:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18779 "" "Go-http-client/1.1" Nov 26 05:03:28 localhost nova_compute[281415]: 2025-11-26 10:03:28.047 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e147 e147: 6 total, 6 up, 6 in Nov 26 05:03:28 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:28.700 2 INFO neutron.agent.securitygroups_rpc [None req-d6b78ff3-9ff1-4c6f-8f82-03ebfcf541ab bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:03:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e148 e148: 6 total, 6 up, 6 in Nov 26 05:03:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:30.474 262471 INFO neutron.agent.linux.ip_lib [None req-0cd46b6e-dc0a-4313-87b6-42f2e94e3b39 - - - - - -] Device tap7ad0673d-f0 cannot be used as it has no MAC address#033[00m Nov 26 05:03:30 localhost nova_compute[281415]: 2025-11-26 10:03:30.539 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:30 localhost kernel: device tap7ad0673d-f0 entered promiscuous mode Nov 26 05:03:30 localhost NetworkManager[5970]: [1764151410.5508] manager: (tap7ad0673d-f0): new Generic device (/org/freedesktop/NetworkManager/Devices/59) Nov 26 05:03:30 localhost ovn_controller[153664]: 2025-11-26T10:03:30Z|00368|binding|INFO|Claiming lport 7ad0673d-f0c8-4656-af8e-2037e30e3c3b for this chassis. Nov 26 05:03:30 localhost nova_compute[281415]: 2025-11-26 10:03:30.552 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:30 localhost ovn_controller[153664]: 2025-11-26T10:03:30Z|00369|binding|INFO|7ad0673d-f0c8-4656-af8e-2037e30e3c3b: Claiming unknown Nov 26 05:03:30 localhost systemd-udevd[321899]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:03:30 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:30.571 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-7f4f14e5-4232-482d-96f4-8da45058c531', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f4f14e5-4232-482d-96f4-8da45058c531', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a28ce44d2e9a40519a8955587c056dae', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56e68f0a-110e-49e9-95dc-7c86ebfe4a46, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ad0673d-f0c8-4656-af8e-2037e30e3c3b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:30 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:30.573 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad0673d-f0c8-4656-af8e-2037e30e3c3b in datapath 7f4f14e5-4232-482d-96f4-8da45058c531 bound to our chassis#033[00m Nov 26 05:03:30 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:30.575 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7f4f14e5-4232-482d-96f4-8da45058c531 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:03:30 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:30.576 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[748baf2f-9abf-425a-94cc-59c68a9d5782]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e149 e149: 6 total, 6 up, 6 in Nov 26 05:03:30 localhost ovn_controller[153664]: 2025-11-26T10:03:30Z|00370|binding|INFO|Setting lport 7ad0673d-f0c8-4656-af8e-2037e30e3c3b ovn-installed in OVS Nov 26 05:03:30 localhost ovn_controller[153664]: 2025-11-26T10:03:30Z|00371|binding|INFO|Setting lport 7ad0673d-f0c8-4656-af8e-2037e30e3c3b up in Southbound Nov 26 05:03:30 localhost nova_compute[281415]: 2025-11-26 10:03:30.599 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:30 localhost nova_compute[281415]: 2025-11-26 10:03:30.646 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:30 localhost nova_compute[281415]: 2025-11-26 10:03:30.686 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:31 localhost podman[321952]: Nov 26 05:03:31 localhost podman[321952]: 2025-11-26 10:03:31.585289522 +0000 UTC m=+0.109780379 container create 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:03:31 localhost podman[321952]: 2025-11-26 10:03:31.534673159 +0000 UTC m=+0.059164046 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:03:31 localhost systemd[1]: Started libpod-conmon-01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829.scope. Nov 26 05:03:31 localhost systemd[1]: Started libcrun container. Nov 26 05:03:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646407ecee7e7a9fe8644e20e65ce80cd17b673576e28e9e324e5d13ad3b64e1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:03:31 localhost podman[321952]: 2025-11-26 10:03:31.702818778 +0000 UTC m=+0.227309635 container init 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 26 05:03:31 localhost podman[321952]: 2025-11-26 10:03:31.713560355 +0000 UTC m=+0.238051212 container start 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:03:31 localhost dnsmasq[321971]: started, version 2.85 cachesize 150 Nov 26 05:03:31 localhost dnsmasq[321971]: DNS service limited to local subnets Nov 26 05:03:31 localhost dnsmasq[321971]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:03:31 localhost dnsmasq[321971]: warning: no upstream servers configured Nov 26 05:03:31 localhost dnsmasq-dhcp[321971]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:03:31 localhost dnsmasq[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/addn_hosts - 0 addresses Nov 26 05:03:31 localhost dnsmasq-dhcp[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/host Nov 26 05:03:31 localhost dnsmasq-dhcp[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/opts Nov 26 05:03:31 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:31.782 262471 INFO neutron.agent.dhcp.agent [None req-0cd46b6e-dc0a-4313-87b6-42f2e94e3b39 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:03:31Z, description=, device_id=97bf1045-7f39-457e-b91d-348930d504a5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=3ba684c4-87c1-4d4f-b6e9-af51e56d50a7, ip_allocation=immediate, mac_address=fa:16:3e:67:23:d8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:03:26Z, description=, dns_domain=, id=7f4f14e5-4232-482d-96f4-8da45058c531, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-434360503, 
port_security_enabled=True, project_id=a28ce44d2e9a40519a8955587c056dae, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49557, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2150, status=ACTIVE, subnets=['baa26e94-2226-4e14-a718-b5b1047a4392'], tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:03:29Z, vlan_transparent=None, network_id=7f4f14e5-4232-482d-96f4-8da45058c531, port_security_enabled=False, project_id=a28ce44d2e9a40519a8955587c056dae, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2175, status=DOWN, tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:03:31Z on network 7f4f14e5-4232-482d-96f4-8da45058c531#033[00m Nov 26 05:03:31 localhost nova_compute[281415]: 2025-11-26 10:03:31.855 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:31 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:31.912 262471 INFO neutron.agent.dhcp.agent [None req-c0a66aa5-e64d-4025-9936-2a38c789fc01 - - - - - -] DHCP configuration for ports {'ccce1e2d-6ee8-4f61-afc5-f009d9fb7bb6'} is completed#033[00m Nov 26 05:03:32 localhost dnsmasq[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/addn_hosts - 1 addresses Nov 26 05:03:32 localhost dnsmasq-dhcp[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/host Nov 26 05:03:32 localhost dnsmasq-dhcp[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/opts Nov 26 05:03:32 localhost podman[321990]: 2025-11-26 10:03:32.008375309 +0000 UTC m=+0.068873051 container kill 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 26 05:03:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:32.297 262471 INFO neutron.agent.dhcp.agent [None req-1337d0e9-9c5c-4e0f-a760-e517d0de02c5 - - - - - -] DHCP configuration for ports {'3ba684c4-87c1-4d4f-b6e9-af51e56d50a7'} is completed#033[00m Nov 26 05:03:32 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:32.460 2 INFO neutron.agent.securitygroups_rpc [None req-bd0dbdcc-476a-4c9d-a98c-0ec1415492d2 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:03:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e150 e150: 6 total, 6 up, 6 in Nov 26 05:03:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:32.818 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:03:31Z, description=, device_id=97bf1045-7f39-457e-b91d-348930d504a5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=3ba684c4-87c1-4d4f-b6e9-af51e56d50a7, ip_allocation=immediate, mac_address=fa:16:3e:67:23:d8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:03:26Z, description=, dns_domain=, id=7f4f14e5-4232-482d-96f4-8da45058c531, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-RoutersIpV6Test-434360503, port_security_enabled=True, project_id=a28ce44d2e9a40519a8955587c056dae, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49557, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2150, status=ACTIVE, subnets=['baa26e94-2226-4e14-a718-b5b1047a4392'], tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:03:29Z, vlan_transparent=None, network_id=7f4f14e5-4232-482d-96f4-8da45058c531, port_security_enabled=False, project_id=a28ce44d2e9a40519a8955587c056dae, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2175, status=DOWN, tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:03:31Z on network 7f4f14e5-4232-482d-96f4-8da45058c531#033[00m Nov 26 05:03:33 localhost dnsmasq[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/addn_hosts - 1 addresses Nov 26 05:03:33 localhost podman[322028]: 2025-11-26 10:03:33.039047236 +0000 UTC m=+0.069405219 container kill 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 05:03:33 localhost dnsmasq-dhcp[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/host Nov 26 05:03:33 localhost dnsmasq-dhcp[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/opts Nov 26 05:03:33 localhost nova_compute[281415]: 2025-11-26 10:03:33.086 281419 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:03:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:03:33 localhost podman[322044]: 2025-11-26 10:03:33.200903969 +0000 UTC m=+0.095916639 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 05:03:33 localhost podman[322044]: 2025-11-26 10:03:33.242389633 +0000 UTC m=+0.137402333 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 
'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 05:03:33 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:03:33 localhost podman[322046]: 2025-11-26 10:03:33.261865587 +0000 UTC m=+0.154174478 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 05:03:33 localhost podman[322046]: 2025-11-26 10:03:33.274408387 +0000 UTC m=+0.166717308 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute) Nov 26 05:03:33 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:03:33 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:33.346 262471 INFO neutron.agent.dhcp.agent [None req-2fe35960-96af-4bc2-862d-48d1969c6a8d - - - - - -] DHCP configuration for ports {'3ba684c4-87c1-4d4f-b6e9-af51e56d50a7'} is completed#033[00m Nov 26 05:03:33 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e151 e151: 6 total, 6 up, 6 in Nov 26 05:03:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e152 e152: 6 total, 6 up, 6 in Nov 26 05:03:35 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:03:35 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2771094319' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:03:35 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:03:35 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2771094319' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:03:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:03:36 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:03:36 localhost nova_compute[281415]: 2025-11-26 10:03:36.899 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:36 localhost dnsmasq[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/addn_hosts - 0 addresses Nov 26 05:03:36 localhost dnsmasq-dhcp[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/host Nov 26 05:03:36 localhost podman[322107]: 2025-11-26 10:03:36.972106808 +0000 UTC m=+0.061668079 container kill 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 05:03:36 localhost dnsmasq-dhcp[321971]: read /var/lib/neutron/dhcp/7f4f14e5-4232-482d-96f4-8da45058c531/opts Nov 26 05:03:37 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:37.182 262471 INFO neutron.agent.linux.ip_lib [None req-a343e935-a3fb-4708-9721-83b03033b5af - - - - - -] Device tape6a0b266-65 cannot be used as it has no MAC address#033[00m Nov 26 05:03:37 localhost 
kernel: device tap7ad0673d-f0 left promiscuous mode Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.185 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost ovn_controller[153664]: 2025-11-26T10:03:37Z|00372|binding|INFO|Releasing lport 7ad0673d-f0c8-4656-af8e-2037e30e3c3b from this chassis (sb_readonly=0) Nov 26 05:03:37 localhost ovn_controller[153664]: 2025-11-26T10:03:37Z|00373|binding|INFO|Setting lport 7ad0673d-f0c8-4656-af8e-2037e30e3c3b down in Southbound Nov 26 05:03:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:37.193 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-7f4f14e5-4232-482d-96f4-8da45058c531', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f4f14e5-4232-482d-96f4-8da45058c531', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a28ce44d2e9a40519a8955587c056dae', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56e68f0a-110e-49e9-95dc-7c86ebfe4a46, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ad0673d-f0c8-4656-af8e-2037e30e3c3b) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:37.194 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad0673d-f0c8-4656-af8e-2037e30e3c3b in datapath 7f4f14e5-4232-482d-96f4-8da45058c531 unbound from our chassis#033[00m Nov 26 05:03:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:37.195 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7f4f14e5-4232-482d-96f4-8da45058c531 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:03:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:37.196 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f384cd56-995b-4792-9361-54075320f416]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.202 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.212 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost kernel: device tape6a0b266-65 entered promiscuous mode Nov 26 05:03:37 localhost NetworkManager[5970]: [1764151417.2217] manager: (tape6a0b266-65): new Generic device (/org/freedesktop/NetworkManager/Devices/60) Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.221 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost ovn_controller[153664]: 2025-11-26T10:03:37Z|00374|binding|INFO|Claiming lport e6a0b266-6564-46aa-9e5e-68e512c254f0 for this 
chassis. Nov 26 05:03:37 localhost ovn_controller[153664]: 2025-11-26T10:03:37Z|00375|binding|INFO|e6a0b266-6564-46aa-9e5e-68e512c254f0: Claiming unknown Nov 26 05:03:37 localhost systemd-udevd[322138]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.228 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost ovn_controller[153664]: 2025-11-26T10:03:37Z|00376|binding|INFO|Setting lport e6a0b266-6564-46aa-9e5e-68e512c254f0 ovn-installed in OVS Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.241 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost ovn_controller[153664]: 2025-11-26T10:03:37Z|00377|binding|INFO|Setting lport e6a0b266-6564-46aa-9e5e-68e512c254f0 up in Southbound Nov 26 05:03:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:37.249 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-c351755e-976a-4c66-be24-e78c1192e045', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c351755e-976a-4c66-be24-e78c1192e045', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a556ffda51124a0fb5ad54c9ab27653e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b989b984-81a9-468c-92e0-4e964397d8bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e6a0b266-6564-46aa-9e5e-68e512c254f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:37.250 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e6a0b266-6564-46aa-9e5e-68e512c254f0 in datapath c351755e-976a-4c66-be24-e78c1192e045 bound to our chassis#033[00m Nov 26 05:03:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:37.251 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c351755e-976a-4c66-be24-e78c1192e045 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:03:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:37.252 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb58894-575f-4588-989e-d36e3dedb879]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:37 localhost journal[229445]: ethtool ioctl error on tape6a0b266-65: No such device Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.259 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost journal[229445]: ethtool ioctl error on tape6a0b266-65: No such device Nov 26 05:03:37 localhost journal[229445]: ethtool ioctl error on tape6a0b266-65: No such device Nov 26 05:03:37 localhost journal[229445]: ethtool ioctl error on tape6a0b266-65: No such device Nov 26 05:03:37 localhost 
journal[229445]: ethtool ioctl error on tape6a0b266-65: No such device Nov 26 05:03:37 localhost journal[229445]: ethtool ioctl error on tape6a0b266-65: No such device Nov 26 05:03:37 localhost journal[229445]: ethtool ioctl error on tape6a0b266-65: No such device Nov 26 05:03:37 localhost journal[229445]: ethtool ioctl error on tape6a0b266-65: No such device Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.296 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost nova_compute[281415]: 2025-11-26 10:03:37.321 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e153 e153: 6 total, 6 up, 6 in Nov 26 05:03:38 localhost nova_compute[281415]: 2025-11-26 10:03:38.130 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:38 localhost podman[322209]: Nov 26 05:03:38 localhost podman[322209]: 2025-11-26 10:03:38.192958904 +0000 UTC m=+0.139311400 container create 9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c351755e-976a-4c66-be24-e78c1192e045, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:03:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 05:03:38 localhost systemd[1]: Started libpod-conmon-9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c.scope. Nov 26 05:03:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:03:38 localhost podman[322209]: 2025-11-26 10:03:38.144756092 +0000 UTC m=+0.091108618 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:03:38 localhost systemd[1]: tmp-crun.xL4iSn.mount: Deactivated successfully. Nov 26 05:03:38 localhost systemd[1]: Started libcrun container. Nov 26 05:03:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23971d47478032a10dccde92bfc6e0fb83fe997c423e30015b49916dcc38ff38/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:03:38 localhost podman[322209]: 2025-11-26 10:03:38.296146117 +0000 UTC m=+0.242498623 container init 9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c351755e-976a-4c66-be24-e78c1192e045, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 05:03:38 localhost podman[322209]: 2025-11-26 10:03:38.310381227 +0000 UTC m=+0.256733683 container start 9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c351755e-976a-4c66-be24-e78c1192e045, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:03:38 localhost dnsmasq[322252]: started, version 2.85 cachesize 150 Nov 26 05:03:38 localhost dnsmasq[322252]: DNS service limited to local subnets Nov 26 05:03:38 localhost dnsmasq[322252]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:03:38 localhost dnsmasq[322252]: warning: no upstream servers configured Nov 26 05:03:38 localhost dnsmasq-dhcp[322252]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Nov 26 05:03:38 localhost dnsmasq[322252]: read /var/lib/neutron/dhcp/c351755e-976a-4c66-be24-e78c1192e045/addn_hosts - 0 addresses Nov 26 05:03:38 localhost dnsmasq-dhcp[322252]: read /var/lib/neutron/dhcp/c351755e-976a-4c66-be24-e78c1192e045/host Nov 26 05:03:38 localhost dnsmasq-dhcp[322252]: read /var/lib/neutron/dhcp/c351755e-976a-4c66-be24-e78c1192e045/opts Nov 26 05:03:38 localhost podman[322225]: 2025-11-26 10:03:38.365066989 +0000 UTC m=+0.115568809 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 26 05:03:38 localhost podman[322225]: 2025-11-26 10:03:38.386326836 +0000 UTC m=+0.136828726 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 26 05:03:38 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:03:38 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e154 e154: 6 total, 6 up, 6 in Nov 26 05:03:38 localhost podman[322223]: 2025-11-26 10:03:38.483873803 +0000 UTC m=+0.243859102 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 26 05:03:38 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:38.501 262471 INFO neutron.agent.dhcp.agent [None req-64cc011e-eeef-41a0-bb00-2887aee1c2f6 - - - - - -] DHCP configuration for ports {'8e2c4560-4a3d-4a57-8056-2d95674c0cee'} is completed#033[00m Nov 26 05:03:38 localhost podman[322223]: 2025-11-26 10:03:38.559551885 +0000 UTC m=+0.319537164 container exec_died 
123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 05:03:38 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:03:38 localhost dnsmasq[321971]: exiting on receipt of SIGTERM Nov 26 05:03:38 localhost podman[322286]: 2025-11-26 10:03:38.604685076 +0000 UTC m=+0.072579341 container kill 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:03:38 localhost systemd[1]: libpod-01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829.scope: Deactivated successfully. Nov 26 05:03:38 localhost podman[322301]: 2025-11-26 10:03:38.69501111 +0000 UTC m=+0.069756609 container died 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:03:38 localhost podman[322301]: 2025-11-26 10:03:38.725920471 +0000 UTC m=+0.100665910 container cleanup 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:03:38 localhost systemd[1]: libpod-conmon-01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829.scope: Deactivated successfully. Nov 26 05:03:38 localhost podman[322302]: 2025-11-26 10:03:38.772577498 +0000 UTC m=+0.142137894 container remove 01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7f4f14e5-4232-482d-96f4-8da45058c531, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 05:03:38 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:38.796 2 INFO neutron.agent.securitygroups_rpc [None req-cfcae91f-878e-4b2f-87b4-971cf57d9ffa bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['1fe38523-8001-4ec9-b037-99c71dd3b61c']#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.144 262471 INFO neutron.agent.dhcp.agent [None req-af464d7e-2f36-445d-be22-c5be840dd74c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.148 262471 INFO neutron.agent.dhcp.agent [None req-c5a2b9df-a10c-4fbe-a719-a325cd188f8b - - - - - -] Synchronizing state#033[00m Nov 26 05:03:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 
05:03:39 localhost systemd[1]: var-lib-containers-storage-overlay-646407ecee7e7a9fe8644e20e65ce80cd17b673576e28e9e324e5d13ad3b64e1-merged.mount: Deactivated successfully. Nov 26 05:03:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01b9ac506e6225619a3c631a67e9993e3755257ac97cb532e357114a96292829-userdata-shm.mount: Deactivated successfully. Nov 26 05:03:39 localhost systemd[1]: run-netns-qdhcp\x2d7f4f14e5\x2d4232\x2d482d\x2d96f4\x2d8da45058c531.mount: Deactivated successfully. Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.319 262471 INFO neutron.agent.dhcp.agent [None req-2b3c46c1-c948-41db-b362-1eb41e744815 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.320 262471 INFO neutron.agent.dhcp.agent [-] Starting network 7f4f14e5-4232-482d-96f4-8da45058c531 dhcp configuration#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.320 262471 INFO neutron.agent.dhcp.agent [-] Finished network 7f4f14e5-4232-482d-96f4-8da45058c531 dhcp configuration#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.321 262471 INFO neutron.agent.dhcp.agent [-] Starting network 98add213-215f-47b1-9de1-da68264e62d9 dhcp configuration#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.321 262471 INFO neutron.agent.dhcp.agent [-] Finished network 98add213-215f-47b1-9de1-da68264e62d9 dhcp configuration#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.321 262471 INFO neutron.agent.dhcp.agent [None req-2b3c46c1-c948-41db-b362-1eb41e744815 - - - - - -] Synchronizing state complete#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.321 262471 INFO neutron.agent.dhcp.agent [None req-2a1dd2b2-ba02-44a4-a690-2e6b0c91c157 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m 
Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.631 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:39 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:39.772 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:03:39 localhost ovn_controller[153664]: 2025-11-26T10:03:39Z|00378|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:03:39 localhost nova_compute[281415]: 2025-11-26 10:03:39.823 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:40 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:40.230 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:fd:62 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41dab89f-0031-48c7-994f-1563fe8e097d, chassis=[], tunnel_key=1, 
gateway_chassis=[], requested_chassis=[], logical_port=6e2528d5-a5e9-492f-97b9-4bddb367a6ae) old=Port_Binding(mac=['fa:16:3e:63:fd:62 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:40 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:40.233 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6e2528d5-a5e9-492f-97b9-4bddb367a6ae in datapath d124220f-7e45-4229-9a4e-08e47dba3ff8 updated#033[00m Nov 26 05:03:40 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:40.235 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d124220f-7e45-4229-9a4e-08e47dba3ff8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:03:40 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:40.236 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e7b292-eeb1-47e9-9346-bf7c91c17814]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:41 localhost nova_compute[281415]: 2025-11-26 10:03:41.937 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:42 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:42.550 2 INFO neutron.agent.securitygroups_rpc [None 
req-68ca35fe-17a8-40c7-bc3f-470eb7bddfc8 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['1fe38523-8001-4ec9-b037-99c71dd3b61c', 'b6a2fcdb-98ff-432a-a584-e02aa70aabc8']#033[00m Nov 26 05:03:42 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:42.589 2 INFO neutron.agent.securitygroups_rpc [None req-18e818ad-4b99-4d0e-85cb-73571558dd54 9cd6a2fca5b14a20a20bb1fb09c0d3c2 a28ce44d2e9a40519a8955587c056dae - - default default] Security group member updated ['abfeae79-b54d-4a2b-9096-6c50815ff4ea']#033[00m Nov 26 05:03:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:03:42 localhost podman[322327]: 2025-11-26 10:03:42.830255253 +0000 UTC m=+0.089798980 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:03:42 localhost podman[322327]: 2025-11-26 10:03:42.868503274 +0000 UTC m=+0.128046981 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:03:42 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:03:43 localhost nova_compute[281415]: 2025-11-26 10:03:43.173 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:43 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:43.426 2 INFO neutron.agent.securitygroups_rpc [None req-e80b4c5f-bffa-48a3-93ff-92a859d94b60 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['b6a2fcdb-98ff-432a-a584-e02aa70aabc8']#033[00m Nov 26 05:03:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e155 e155: 6 total, 6 up, 6 in Nov 26 05:03:43 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:43.838 262471 INFO neutron.agent.linux.ip_lib [None req-efe04d50-41fd-48ef-a5bf-25726311deeb - - - - - -] Device tape966386f-83 cannot be used as it has no MAC address#033[00m Nov 26 05:03:43 localhost nova_compute[281415]: 2025-11-26 10:03:43.865 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:43 localhost kernel: device tape966386f-83 entered promiscuous mode Nov 26 05:03:43 localhost nova_compute[281415]: 2025-11-26 10:03:43.872 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:43 localhost ovn_controller[153664]: 2025-11-26T10:03:43Z|00379|binding|INFO|Claiming lport e966386f-8317-49af-b52a-0c8093a7a76a for this chassis. Nov 26 05:03:43 localhost NetworkManager[5970]: [1764151423.8731] manager: (tape966386f-83): new Generic device (/org/freedesktop/NetworkManager/Devices/61) Nov 26 05:03:43 localhost ovn_controller[153664]: 2025-11-26T10:03:43Z|00380|binding|INFO|e966386f-8317-49af-b52a-0c8093a7a76a: Claiming unknown Nov 26 05:03:43 localhost systemd-udevd[322360]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:03:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:43.887 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d13a4499-ed1d-419a-b7ec-18731e67f9ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d13a4499-ed1d-419a-b7ec-18731e67f9ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '140ee02dff30450e88d5baa79f6f7df2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3694eb6a-62d4-45ac-b83e-207240edfdd0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e966386f-8317-49af-b52a-0c8093a7a76a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:43.889 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e966386f-8317-49af-b52a-0c8093a7a76a in datapath d13a4499-ed1d-419a-b7ec-18731e67f9ad bound to our chassis#033[00m Nov 26 05:03:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:43.891 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d13a4499-ed1d-419a-b7ec-18731e67f9ad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:03:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:43.897 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ea9e74c9-466f-450a-870e-fb8007e86f79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:43 localhost journal[229445]: ethtool ioctl error on tape966386f-83: No such device Nov 26 05:03:43 localhost ovn_controller[153664]: 2025-11-26T10:03:43Z|00381|binding|INFO|Setting lport e966386f-8317-49af-b52a-0c8093a7a76a ovn-installed in OVS Nov 26 05:03:43 localhost ovn_controller[153664]: 2025-11-26T10:03:43Z|00382|binding|INFO|Setting lport e966386f-8317-49af-b52a-0c8093a7a76a up in Southbound Nov 26 05:03:43 localhost nova_compute[281415]: 2025-11-26 10:03:43.912 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:43 localhost journal[229445]: ethtool ioctl error on tape966386f-83: No such device Nov 26 05:03:43 localhost nova_compute[281415]: 2025-11-26 10:03:43.914 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:43 localhost journal[229445]: ethtool ioctl error on tape966386f-83: No such device Nov 26 05:03:43 localhost journal[229445]: ethtool ioctl error on tape966386f-83: No such device Nov 26 05:03:43 localhost journal[229445]: ethtool ioctl error on tape966386f-83: No such device Nov 26 05:03:43 localhost journal[229445]: ethtool ioctl error on tape966386f-83: No such device Nov 26 05:03:43 localhost journal[229445]: ethtool ioctl error on tape966386f-83: No such device Nov 26 05:03:43 localhost journal[229445]: ethtool ioctl error on tape966386f-83: No such device Nov 26 05:03:43 localhost nova_compute[281415]: 2025-11-26 10:03:43.959 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:43 localhost nova_compute[281415]: 2025-11-26 10:03:43.997 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:45 localhost podman[322431]: Nov 26 05:03:45 localhost podman[322431]: 2025-11-26 10:03:45.165764989 +0000 UTC m=+0.094433167 container create b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d13a4499-ed1d-419a-b7ec-18731e67f9ad, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 05:03:45 localhost podman[322431]: 2025-11-26 10:03:45.119795064 +0000 UTC m=+0.048463262 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:03:45 localhost systemd[1]: Started libpod-conmon-b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811.scope. Nov 26 05:03:45 localhost systemd[1]: Started libcrun container. 
Nov 26 05:03:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a08f726bcf402cd459f5ea4f8c45f486c7fafaceadfe24a40e70399b8a68d6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:03:45 localhost podman[322431]: 2025-11-26 10:03:45.256295584 +0000 UTC m=+0.184963772 container init b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d13a4499-ed1d-419a-b7ec-18731e67f9ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:03:45 localhost podman[322431]: 2025-11-26 10:03:45.265132797 +0000 UTC m=+0.193800985 container start b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d13a4499-ed1d-419a-b7ec-18731e67f9ad, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true) Nov 26 05:03:45 localhost dnsmasq[322449]: started, version 2.85 cachesize 150 Nov 26 05:03:45 localhost dnsmasq[322449]: DNS service limited to local subnets Nov 26 05:03:45 localhost dnsmasq[322449]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:03:45 localhost dnsmasq[322449]: warning: no upstream servers 
configured Nov 26 05:03:45 localhost dnsmasq-dhcp[322449]: DHCP, static leases only on 10.100.255.240, lease time 1d Nov 26 05:03:45 localhost dnsmasq[322449]: read /var/lib/neutron/dhcp/d13a4499-ed1d-419a-b7ec-18731e67f9ad/addn_hosts - 0 addresses Nov 26 05:03:45 localhost dnsmasq-dhcp[322449]: read /var/lib/neutron/dhcp/d13a4499-ed1d-419a-b7ec-18731e67f9ad/host Nov 26 05:03:45 localhost dnsmasq-dhcp[322449]: read /var/lib/neutron/dhcp/d13a4499-ed1d-419a-b7ec-18731e67f9ad/opts Nov 26 05:03:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:45.374 262471 INFO neutron.agent.dhcp.agent [None req-6dd63c96-e45e-467f-ad5c-bc78c68d241a - - - - - -] DHCP configuration for ports {'b45e0e09-aef5-4135-8dff-b021284ec024'} is completed#033[00m Nov 26 05:03:45 localhost openstack_network_exporter[242153]: ERROR 10:03:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:03:45 localhost openstack_network_exporter[242153]: ERROR 10:03:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:03:45 localhost openstack_network_exporter[242153]: ERROR 10:03:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:03:45 localhost openstack_network_exporter[242153]: ERROR 10:03:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:03:45 localhost openstack_network_exporter[242153]: Nov 26 05:03:45 localhost openstack_network_exporter[242153]: ERROR 10:03:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:03:45 localhost openstack_network_exporter[242153]: Nov 26 05:03:45 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:45.831 2 INFO neutron.agent.securitygroups_rpc [None req-81aa01cd-662c-4fde-a59a-3911606c9edb 9cd6a2fca5b14a20a20bb1fb09c0d3c2 a28ce44d2e9a40519a8955587c056dae - - default default] Security group member 
updated ['abfeae79-b54d-4a2b-9096-6c50815ff4ea']#033[00m Nov 26 05:03:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:46.066 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:fd:62 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41dab89f-0031-48c7-994f-1563fe8e097d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6e2528d5-a5e9-492f-97b9-4bddb367a6ae) old=Port_Binding(mac=['fa:16:3e:63:fd:62 10.100.0.18 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:46.068 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6e2528d5-a5e9-492f-97b9-4bddb367a6ae in datapath d124220f-7e45-4229-9a4e-08e47dba3ff8 updated#033[00m Nov 26 05:03:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:46.070 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d124220f-7e45-4229-9a4e-08e47dba3ff8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:03:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:46.072 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8fe004-1bdf-416a-bf03-cf455bc1fed2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:46 localhost nova_compute[281415]: 2025-11-26 10:03:46.993 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:47 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e156 e156: 6 total, 6 up, 6 in Nov 26 05:03:48 localhost nova_compute[281415]: 2025-11-26 10:03:48.215 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:48 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:48.221 2 INFO neutron.agent.securitygroups_rpc [None req-aad702cf-d3d8-4c20-8929-280771ad3e3e bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['84381e85-c98b-4e9a-b4fa-023b4899d35e']#033[00m Nov 26 05:03:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 05:03:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:03:48 localhost podman[322454]: 2025-11-26 10:03:48.342135344 +0000 UTC m=+0.090686752 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS) Nov 26 05:03:48 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:48.342 262471 INFO neutron.agent.linux.ip_lib [None req-cf6c373d-bdc2-4d4d-a549-fa9336ac3158 - - - - - -] Device tap0997460b-5e cannot be used as it has no MAC address#033[00m Nov 26 05:03:48 localhost nova_compute[281415]: 2025-11-26 10:03:48.375 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:48 localhost kernel: device tap0997460b-5e entered promiscuous mode Nov 26 05:03:48 localhost NetworkManager[5970]: [1764151428.3817] manager: (tap0997460b-5e): new Generic device (/org/freedesktop/NetworkManager/Devices/62) Nov 26 05:03:48 localhost podman[322454]: 2025-11-26 10:03:48.383182445 +0000 UTC m=+0.131733883 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 26 05:03:48 localhost systemd-udevd[322491]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:03:48 localhost ovn_controller[153664]: 2025-11-26T10:03:48Z|00383|binding|INFO|Claiming lport 0997460b-5e19-4748-b46b-31180af42203 for this chassis. Nov 26 05:03:48 localhost ovn_controller[153664]: 2025-11-26T10:03:48Z|00384|binding|INFO|0997460b-5e19-4748-b46b-31180af42203: Claiming unknown Nov 26 05:03:48 localhost nova_compute[281415]: 2025-11-26 10:03:48.385 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:48.401 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-f0549b39-ed4b-4ffc-bb4b-a02397604463', 
'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0549b39-ed4b-4ffc-bb4b-a02397604463', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f26aa24f87924ee1873628bdfc9d6d35', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fca7c47-efde-413f-b7cc-9f4bd34ee3e1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0997460b-5e19-4748-b46b-31180af42203) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:48 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:03:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:48.408 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0997460b-5e19-4748-b46b-31180af42203 in datapath f0549b39-ed4b-4ffc-bb4b-a02397604463 bound to our chassis#033[00m Nov 26 05:03:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:48.410 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f0549b39-ed4b-4ffc-bb4b-a02397604463 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:03:48 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:48.411 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[5abfc379-7d5e-4a66-9996-c0e27db3474e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:48 localhost ovn_controller[153664]: 2025-11-26T10:03:48Z|00385|binding|INFO|Setting lport 0997460b-5e19-4748-b46b-31180af42203 ovn-installed in OVS Nov 26 05:03:48 
localhost ovn_controller[153664]: 2025-11-26T10:03:48Z|00386|binding|INFO|Setting lport 0997460b-5e19-4748-b46b-31180af42203 up in Southbound Nov 26 05:03:48 localhost nova_compute[281415]: 2025-11-26 10:03:48.419 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:48 localhost nova_compute[281415]: 2025-11-26 10:03:48.421 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:48 localhost podman[322455]: 2025-11-26 10:03:48.422604516 +0000 UTC m=+0.167072227 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:03:48 localhost podman[322455]: 2025-11-26 10:03:48.440644946 +0000 UTC m=+0.185112677 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:03:48 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:03:48 localhost nova_compute[281415]: 2025-11-26 10:03:48.479 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:48 localhost nova_compute[281415]: 2025-11-26 10:03:48.519 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:49 localhost podman[322550]: Nov 26 05:03:49 localhost podman[322550]: 2025-11-26 10:03:49.553994781 +0000 UTC m=+0.132146115 container create 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:03:49 localhost podman[322550]: 2025-11-26 10:03:49.470531885 +0000 UTC m=+0.048683229 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:03:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e157 e157: 6 total, 6 up, 6 in Nov 26 05:03:49 localhost systemd[1]: 
Started libpod-conmon-416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653.scope. Nov 26 05:03:49 localhost systemd[1]: Started libcrun container. Nov 26 05:03:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e79c0d28f34c5cebdd492b41edb96bd4e577be59e9da077accbd62b367103fa1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:03:49 localhost podman[322550]: 2025-11-26 10:03:49.640788 +0000 UTC m=+0.218939334 container init 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:03:49 localhost podman[322550]: 2025-11-26 10:03:49.650198902 +0000 UTC m=+0.228350246 container start 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 26 05:03:49 localhost dnsmasq[322569]: started, version 2.85 cachesize 150 Nov 26 05:03:49 localhost dnsmasq[322569]: DNS service limited to local subnets Nov 26 05:03:49 localhost dnsmasq[322569]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua 
TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:03:49 localhost dnsmasq[322569]: warning: no upstream servers configured Nov 26 05:03:49 localhost dnsmasq-dhcp[322569]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:03:49 localhost dnsmasq[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/addn_hosts - 0 addresses Nov 26 05:03:49 localhost dnsmasq-dhcp[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/host Nov 26 05:03:49 localhost dnsmasq-dhcp[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/opts Nov 26 05:03:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:49.850 262471 INFO neutron.agent.dhcp.agent [None req-f0486b0c-5359-4d91-ab45-0ca5ded69770 - - - - - -] DHCP configuration for ports {'33b7edd4-e9f4-4c32-8e47-a2560f714210'} is completed#033[00m Nov 26 05:03:50 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e158 e158: 6 total, 6 up, 6 in Nov 26 05:03:50 localhost nova_compute[281415]: 2025-11-26 10:03:50.759 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:51 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:51.505 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:fd:62 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41dab89f-0031-48c7-994f-1563fe8e097d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6e2528d5-a5e9-492f-97b9-4bddb367a6ae) old=Port_Binding(mac=['fa:16:3e:63:fd:62 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d124220f-7e45-4229-9a4e-08e47dba3ff8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c9a6d35bfc5f440e9fdc4ed36d883eff', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:51 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:51.507 159486 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6e2528d5-a5e9-492f-97b9-4bddb367a6ae in datapath d124220f-7e45-4229-9a4e-08e47dba3ff8 updated#033[00m Nov 26 05:03:51 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:51.510 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d124220f-7e45-4229-9a4e-08e47dba3ff8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:03:51 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:51.511 159592 DEBUG 
oslo.privsep.daemon [-] privsep: reply[8371bfbe-9272-4e91-b732-d906e9a5d279]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:51 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e159 e159: 6 total, 6 up, 6 in Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0. Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.640173) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40 Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151431640223, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2659, "num_deletes": 262, "total_data_size": 4906590, "memory_usage": 5050288, "flush_reason": "Manual Compaction"} Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151431664749, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 3208610, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22934, "largest_seqno": 25588, "table_properties": {"data_size": 3198448, "index_size": 6539, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 22092, "raw_average_key_size": 21, "raw_value_size": 3177791, "raw_average_value_size": 3106, "num_data_blocks": 282, "num_entries": 1023, "num_filter_entries": 1023, 
"num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151272, "oldest_key_time": 1764151272, "file_creation_time": 1764151431, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}} Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 24682 microseconds, and 10694 cpu microseconds. Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.664844) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 3208610 bytes OK Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.664880) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.666771) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.666796) EVENT_LOG_v1 {"time_micros": 1764151431666788, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.666825) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 4894675, prev total WAL file size 4894675, number of live WAL files 2. Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.669374) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. 
'7061786F73003132303439' seq:0, type:0; will stop at (end) Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(3133KB)], [39(14MB)] Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151431669467, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17890997, "oldest_snapshot_seqno": -1} Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12418 keys, 16355321 bytes, temperature: kUnknown Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151431747172, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 16355321, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16283101, "index_size": 40055, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31109, "raw_key_size": 332076, "raw_average_key_size": 26, "raw_value_size": 16070212, "raw_average_value_size": 1294, "num_data_blocks": 1525, "num_entries": 12418, "num_filter_entries": 12418, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151431, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}} Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.747559) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 16355321 bytes Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.749268) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 229.8 rd, 210.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 14.0 +0.0 blob) out(15.6 +0.0 blob), read-write-amplify(10.7) write-amplify(5.1) OK, records in: 12956, records dropped: 538 output_compression: NoCompression Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.749291) EVENT_LOG_v1 {"time_micros": 1764151431749280, "job": 22, "event": "compaction_finished", "compaction_time_micros": 77850, "compaction_time_cpu_micros": 48147, "output_level": 6, "num_output_files": 1, "total_output_size": 16355321, "num_input_records": 12956, "num_output_records": 12418, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005536118/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151431749753, "job": 22, "event": "table_file_deletion", "file_number": 41} Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151431751606, "job": 22, "event": "table_file_deletion", "file_number": 39} Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.668161) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.751697) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.751707) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.751710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.751713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:03:51 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:03:51.751717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:03:52 localhost nova_compute[281415]: 2025-11-26 10:03:52.029 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:03:52 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/515983333' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:03:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:03:52 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/515983333' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:03:52 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:52.246 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:03:51Z, description=, device_id=1cb68201-9400-4558-9048-600bcec0044a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7b54c15f-c17b-4c7d-8d98-3663b538df00, ip_allocation=immediate, mac_address=fa:16:3e:0c:68:d3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:03:46Z, description=, dns_domain=, id=f0549b39-ed4b-4ffc-bb4b-a02397604463, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1598966045-network, port_security_enabled=True, project_id=f26aa24f87924ee1873628bdfc9d6d35, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14316, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2258, 
status=ACTIVE, subnets=['ab3496d1-1163-4b7d-8255-70e799ffb947'], tags=[], tenant_id=f26aa24f87924ee1873628bdfc9d6d35, updated_at=2025-11-26T10:03:47Z, vlan_transparent=None, network_id=f0549b39-ed4b-4ffc-bb4b-a02397604463, port_security_enabled=False, project_id=f26aa24f87924ee1873628bdfc9d6d35, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2301, status=DOWN, tags=[], tenant_id=f26aa24f87924ee1873628bdfc9d6d35, updated_at=2025-11-26T10:03:51Z on network f0549b39-ed4b-4ffc-bb4b-a02397604463#033[00m Nov 26 05:03:52 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:52.326 262471 INFO neutron.agent.linux.ip_lib [None req-f80a2be6-0a01-4955-9165-9057a7904368 - - - - - -] Device tapa96241e6-86 cannot be used as it has no MAC address#033[00m Nov 26 05:03:52 localhost nova_compute[281415]: 2025-11-26 10:03:52.357 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:52 localhost kernel: device tapa96241e6-86 entered promiscuous mode Nov 26 05:03:52 localhost ovn_controller[153664]: 2025-11-26T10:03:52Z|00387|binding|INFO|Claiming lport a96241e6-8623-4885-9abc-341e3472e16e for this chassis. Nov 26 05:03:52 localhost ovn_controller[153664]: 2025-11-26T10:03:52Z|00388|binding|INFO|a96241e6-8623-4885-9abc-341e3472e16e: Claiming unknown Nov 26 05:03:52 localhost nova_compute[281415]: 2025-11-26 10:03:52.367 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:52 localhost NetworkManager[5970]: [1764151432.3695] manager: (tapa96241e6-86): new Generic device (/org/freedesktop/NetworkManager/Devices/63) Nov 26 05:03:52 localhost systemd-udevd[322594]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:03:52 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:52.382 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d8ca1ffb-6b41-4895-9626-77fdd0551e50', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8ca1ffb-6b41-4895-9626-77fdd0551e50', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '140ee02dff30450e88d5baa79f6f7df2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3c020d1-286d-45b8-be3e-02fb63d30ba9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a96241e6-8623-4885-9abc-341e3472e16e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:52 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:52.384 159486 INFO neutron.agent.ovn.metadata.agent [-] Port a96241e6-8623-4885-9abc-341e3472e16e in datapath d8ca1ffb-6b41-4895-9626-77fdd0551e50 bound to our chassis#033[00m Nov 26 05:03:52 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:52.386 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d8ca1ffb-6b41-4895-9626-77fdd0551e50 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:03:52 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:52.387 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca9f9fc-d1cd-462e-95a0-1af4d5d70890]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:52 localhost nova_compute[281415]: 2025-11-26 10:03:52.389 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:52 localhost ovn_controller[153664]: 2025-11-26T10:03:52Z|00389|binding|INFO|Setting lport a96241e6-8623-4885-9abc-341e3472e16e ovn-installed in OVS Nov 26 05:03:52 localhost ovn_controller[153664]: 2025-11-26T10:03:52Z|00390|binding|INFO|Setting lport a96241e6-8623-4885-9abc-341e3472e16e up in Southbound Nov 26 05:03:52 localhost nova_compute[281415]: 2025-11-26 10:03:52.394 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:52 localhost nova_compute[281415]: 2025-11-26 10:03:52.422 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:52 localhost nova_compute[281415]: 2025-11-26 10:03:52.475 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:52 localhost nova_compute[281415]: 2025-11-26 10:03:52.520 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:52 localhost systemd[1]: tmp-crun.iRj0dT.mount: Deactivated successfully. 
Nov 26 05:03:52 localhost dnsmasq[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/addn_hosts - 1 addresses Nov 26 05:03:52 localhost dnsmasq-dhcp[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/host Nov 26 05:03:52 localhost dnsmasq-dhcp[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/opts Nov 26 05:03:52 localhost podman[322600]: 2025-11-26 10:03:52.535909722 +0000 UTC m=+0.100562607 container kill 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:03:52 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:52.590 2 INFO neutron.agent.securitygroups_rpc [None req-093d8a0e-bafd-42ef-b420-1cdb6aac0c46 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['b75d67e2-e430-4ea8-a1c0-e1638a75f2a7', '84381e85-c98b-4e9a-b4fa-023b4899d35e', 'f8502d0f-ec13-4628-ad59-6d6e315ba187']#033[00m Nov 26 05:03:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e160 e160: 6 total, 6 up, 6 in Nov 26 05:03:52 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:52.819 262471 INFO neutron.agent.dhcp.agent [None req-9ab1e0a7-ea32-4dba-9f95-715a78eb18dc - - - - - -] DHCP configuration for ports {'7b54c15f-c17b-4c7d-8d98-3663b538df00'} is completed#033[00m Nov 26 05:03:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:03:52 localhost ceph-mon[297296]: 
log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1612663540' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:03:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:03:52 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1612663540' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:03:53 localhost nova_compute[281415]: 2025-11-26 10:03:53.255 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:53 localhost podman[322670]: Nov 26 05:03:53 localhost podman[322670]: 2025-11-26 10:03:53.454964528 +0000 UTC m=+0.093167418 container create c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:03:53 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e161 e161: 6 total, 6 up, 6 in Nov 26 05:03:53 localhost systemd[1]: Started libpod-conmon-c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3.scope. 
Nov 26 05:03:53 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:53.508 2 INFO neutron.agent.securitygroups_rpc [None req-4e4f1e65-20d4-464c-85e8-d02f1bb5eb0f bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['b75d67e2-e430-4ea8-a1c0-e1638a75f2a7', 'f8502d0f-ec13-4628-ad59-6d6e315ba187']#033[00m Nov 26 05:03:53 localhost podman[322670]: 2025-11-26 10:03:53.409917552 +0000 UTC m=+0.048120452 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:03:53 localhost systemd[1]: Started libcrun container. Nov 26 05:03:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/520e2f0a574709fbd3e63566b1b13e1ec71409e7b674bdcc2e4bfbbc62b9ec92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:03:53 localhost podman[322670]: 2025-11-26 10:03:53.555281336 +0000 UTC m=+0.193484236 container init c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 05:03:53 localhost podman[322670]: 2025-11-26 10:03:53.571846649 +0000 UTC m=+0.210049539 container start c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 05:03:53 localhost dnsmasq[322689]: started, version 2.85 cachesize 150 Nov 26 05:03:53 localhost dnsmasq[322689]: DNS service limited to local subnets Nov 26 05:03:53 localhost dnsmasq[322689]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:03:53 localhost dnsmasq[322689]: warning: no upstream servers configured Nov 26 05:03:53 localhost dnsmasq-dhcp[322689]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:03:53 localhost dnsmasq[322689]: read /var/lib/neutron/dhcp/d8ca1ffb-6b41-4895-9626-77fdd0551e50/addn_hosts - 0 addresses Nov 26 05:03:53 localhost dnsmasq-dhcp[322689]: read /var/lib/neutron/dhcp/d8ca1ffb-6b41-4895-9626-77fdd0551e50/host Nov 26 05:03:53 localhost dnsmasq-dhcp[322689]: read /var/lib/neutron/dhcp/d8ca1ffb-6b41-4895-9626-77fdd0551e50/opts Nov 26 05:03:53 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:53.784 262471 INFO neutron.agent.dhcp.agent [None req-a221d9a9-f614-41e6-aaad-5d7d6dcba1b3 - - - - - -] DHCP configuration for ports {'ccf39800-304f-40c6-85bd-ba9004790ed6'} is completed#033[00m Nov 26 05:03:53 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:53.813 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:03:51Z, description=, device_id=1cb68201-9400-4558-9048-600bcec0044a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7b54c15f-c17b-4c7d-8d98-3663b538df00, ip_allocation=immediate, 
mac_address=fa:16:3e:0c:68:d3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:03:46Z, description=, dns_domain=, id=f0549b39-ed4b-4ffc-bb4b-a02397604463, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1598966045-network, port_security_enabled=True, project_id=f26aa24f87924ee1873628bdfc9d6d35, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14316, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2258, status=ACTIVE, subnets=['ab3496d1-1163-4b7d-8255-70e799ffb947'], tags=[], tenant_id=f26aa24f87924ee1873628bdfc9d6d35, updated_at=2025-11-26T10:03:47Z, vlan_transparent=None, network_id=f0549b39-ed4b-4ffc-bb4b-a02397604463, port_security_enabled=False, project_id=f26aa24f87924ee1873628bdfc9d6d35, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2301, status=DOWN, tags=[], tenant_id=f26aa24f87924ee1873628bdfc9d6d35, updated_at=2025-11-26T10:03:51Z on network f0549b39-ed4b-4ffc-bb4b-a02397604463#033[00m Nov 26 05:03:54 localhost systemd[1]: tmp-crun.GewC4R.mount: Deactivated successfully. 
Nov 26 05:03:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:03:54 localhost sshd[322690]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:03:54 localhost dnsmasq[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/addn_hosts - 1 addresses Nov 26 05:03:54 localhost dnsmasq-dhcp[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/host Nov 26 05:03:54 localhost podman[322707]: 2025-11-26 10:03:54.897545544 +0000 UTC m=+0.054553422 container kill 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:03:54 localhost dnsmasq-dhcp[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/opts Nov 26 05:03:55 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:55.259 262471 INFO neutron.agent.dhcp.agent [None req-42ee5c14-6f7b-46fb-91ca-8192bfd742fc - - - - - -] DHCP configuration for ports {'7b54c15f-c17b-4c7d-8d98-3663b538df00'} is completed#033[00m Nov 26 05:03:55 localhost ovn_controller[153664]: 2025-11-26T10:03:55Z|00391|binding|INFO|Releasing lport a96241e6-8623-4885-9abc-341e3472e16e from this chassis (sb_readonly=0) Nov 26 05:03:55 localhost ovn_controller[153664]: 2025-11-26T10:03:55Z|00392|binding|INFO|Setting lport a96241e6-8623-4885-9abc-341e3472e16e down in Southbound Nov 26 05:03:55 localhost nova_compute[281415]: 2025-11-26 10:03:55.335 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:55 localhost kernel: device tapa96241e6-86 left promiscuous mode Nov 26 05:03:55 localhost nova_compute[281415]: 2025-11-26 10:03:55.343 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:55 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:55.346 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d8ca1ffb-6b41-4895-9626-77fdd0551e50', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8ca1ffb-6b41-4895-9626-77fdd0551e50', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '140ee02dff30450e88d5baa79f6f7df2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3c020d1-286d-45b8-be3e-02fb63d30ba9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a96241e6-8623-4885-9abc-341e3472e16e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:55 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:55.348 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port a96241e6-8623-4885-9abc-341e3472e16e in datapath d8ca1ffb-6b41-4895-9626-77fdd0551e50 unbound from our chassis#033[00m Nov 26 05:03:55 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:55.351 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8ca1ffb-6b41-4895-9626-77fdd0551e50, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:03:55 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:55.352 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f801064b-c1a2-4720-b950-dc3300ec331d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:55 localhost nova_compute[281415]: 2025-11-26 10:03:55.355 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:03:55 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1171623774' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:03:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:03:55 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1171623774' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:03:56 localhost dnsmasq[322689]: read /var/lib/neutron/dhcp/d8ca1ffb-6b41-4895-9626-77fdd0551e50/addn_hosts - 0 addresses Nov 26 05:03:56 localhost dnsmasq-dhcp[322689]: read /var/lib/neutron/dhcp/d8ca1ffb-6b41-4895-9626-77fdd0551e50/host Nov 26 05:03:56 localhost dnsmasq-dhcp[322689]: read /var/lib/neutron/dhcp/d8ca1ffb-6b41-4895-9626-77fdd0551e50/opts Nov 26 05:03:56 localhost podman[322747]: 2025-11-26 10:03:56.131078303 +0000 UTC m=+0.059814184 container kill c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent [None req-79627428-2c4d-4b66-b94d-312850d047a0 - - - - - -] Unable to reload_allocations dhcp for d8ca1ffb-6b41-4895-9626-77fdd0551e50.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa96241e6-86 not found in namespace qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50. 
Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 26 05:03:56 
localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent return fut.result() Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent raise self._exception Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa96241e6-86 not found in namespace qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50. Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.157 262471 ERROR neutron.agent.dhcp.agent #033[00m Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.160 262471 INFO neutron.agent.dhcp.agent [None req-2b3c46c1-c948-41db-b362-1eb41e744815 - - - - - -] Synchronizing state#033[00m Nov 26 05:03:56 localhost ovn_controller[153664]: 2025-11-26T10:03:56Z|00393|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:03:56 localhost nova_compute[281415]: 2025-11-26 10:03:56.545 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.627 262471 INFO neutron.agent.dhcp.agent [None req-f696aa43-fc6b-4141-a070-5e294a3f51b6 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.628 262471 INFO neutron.agent.dhcp.agent [-] Starting network d8ca1ffb-6b41-4895-9626-77fdd0551e50 dhcp configuration#033[00m Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.629 262471 INFO neutron.agent.dhcp.agent [-] Finished network d8ca1ffb-6b41-4895-9626-77fdd0551e50 dhcp configuration#033[00m Nov 26 05:03:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:56.629 262471 INFO neutron.agent.dhcp.agent [None req-f696aa43-fc6b-4141-a070-5e294a3f51b6 - - - - - -] Synchronizing state complete#033[00m Nov 26 05:03:56 localhost neutron_sriov_agent[255515]: 2025-11-26 10:03:56.698 2 INFO 
neutron.agent.securitygroups_rpc [None req-0bb27143-780e-4584-b2c1-353a94068388 bf5696d8db9f4b2795974d3ed78b4991 c9a6d35bfc5f440e9fdc4ed36d883eff - - default default] Security group member updated ['512f55ba-befd-448e-8449-d75d9733402e']#033[00m Nov 26 05:03:56 localhost dnsmasq[322689]: exiting on receipt of SIGTERM Nov 26 05:03:56 localhost podman[322779]: 2025-11-26 10:03:56.844118086 +0000 UTC m=+0.062544559 container kill c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 26 05:03:56 localhost systemd[1]: libpod-c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3.scope: Deactivated successfully. Nov 26 05:03:56 localhost podman[322793]: 2025-11-26 10:03:56.917275073 +0000 UTC m=+0.057850184 container died c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:03:56 localhost systemd[1]: tmp-crun.rYyczS.mount: Deactivated successfully. 
Nov 26 05:03:56 localhost podman[322793]: 2025-11-26 10:03:56.957080376 +0000 UTC m=+0.097655407 container cleanup c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:03:56 localhost systemd[1]: libpod-conmon-c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3.scope: Deactivated successfully. Nov 26 05:03:57 localhost podman[322795]: 2025-11-26 10:03:57.004059012 +0000 UTC m=+0.135070677 container remove c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d8ca1ffb-6b41-4895-9626-77fdd0551e50, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:03:57 localhost nova_compute[281415]: 2025-11-26 10:03:57.067 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:57 localhost systemd[1]: var-lib-containers-storage-overlay-520e2f0a574709fbd3e63566b1b13e1ec71409e7b674bdcc2e4bfbbc62b9ec92-merged.mount: Deactivated successfully. 
Nov 26 05:03:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c47c77c4f9ba81016088c19307be49e95dac4db8c8c03947edb6663f87e853f3-userdata-shm.mount: Deactivated successfully. Nov 26 05:03:57 localhost systemd[1]: run-netns-qdhcp\x2dd8ca1ffb\x2d6b41\x2d4895\x2d9626\x2d77fdd0551e50.mount: Deactivated successfully. Nov 26 05:03:57 localhost podman[240049]: time="2025-11-26T10:03:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:03:57 localhost podman[240049]: @ - - [26/Nov/2025:10:03:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159340 "" "Go-http-client/1.1" Nov 26 05:03:57 localhost podman[240049]: @ - - [26/Nov/2025:10:03:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20200 "" "Go-http-client/1.1" Nov 26 05:03:58 localhost nova_compute[281415]: 2025-11-26 10:03:58.288 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e162 e162: 6 total, 6 up, 6 in Nov 26 05:03:58 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:03:58.903 262471 INFO neutron.agent.linux.ip_lib [None req-5e596d37-2f02-440e-a5ed-40a463809b14 - - - - - -] Device tap94a1f8ac-70 cannot be used as it has no MAC address#033[00m Nov 26 05:03:58 localhost nova_compute[281415]: 2025-11-26 10:03:58.933 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:58 localhost kernel: device tap94a1f8ac-70 entered promiscuous mode Nov 26 05:03:58 localhost nova_compute[281415]: 2025-11-26 10:03:58.944 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:58 localhost 
ovn_controller[153664]: 2025-11-26T10:03:58Z|00394|binding|INFO|Claiming lport 94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7 for this chassis. Nov 26 05:03:58 localhost ovn_controller[153664]: 2025-11-26T10:03:58Z|00395|binding|INFO|94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7: Claiming unknown Nov 26 05:03:58 localhost NetworkManager[5970]: [1764151438.9472] manager: (tap94a1f8ac-70): new Generic device (/org/freedesktop/NetworkManager/Devices/64) Nov 26 05:03:58 localhost systemd-udevd[322832]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:03:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:58.954 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-fb1cc77c-45ea-4dd3-a42a-ffba463f728a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb1cc77c-45ea-4dd3-a42a-ffba463f728a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '140ee02dff30450e88d5baa79f6f7df2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d3b03c9-c5a8-4998-b1ed-401a93ac8687, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:03:58 
localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:58.956 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7 in datapath fb1cc77c-45ea-4dd3-a42a-ffba463f728a bound to our chassis#033[00m Nov 26 05:03:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:58.958 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fb1cc77c-45ea-4dd3-a42a-ffba463f728a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:03:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:03:58.959 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f08f155c-8e0b-4021-8bb9-dd2e822bb925]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:03:58 localhost journal[229445]: ethtool ioctl error on tap94a1f8ac-70: No such device Nov 26 05:03:58 localhost journal[229445]: ethtool ioctl error on tap94a1f8ac-70: No such device Nov 26 05:03:58 localhost ovn_controller[153664]: 2025-11-26T10:03:58Z|00396|binding|INFO|Setting lport 94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7 ovn-installed in OVS Nov 26 05:03:58 localhost ovn_controller[153664]: 2025-11-26T10:03:58Z|00397|binding|INFO|Setting lport 94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7 up in Southbound Nov 26 05:03:58 localhost journal[229445]: ethtool ioctl error on tap94a1f8ac-70: No such device Nov 26 05:03:58 localhost nova_compute[281415]: 2025-11-26 10:03:58.992 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:59 localhost journal[229445]: ethtool ioctl error on tap94a1f8ac-70: No such device Nov 26 05:03:59 localhost journal[229445]: ethtool ioctl error on tap94a1f8ac-70: No such device Nov 26 05:03:59 localhost journal[229445]: ethtool ioctl error on tap94a1f8ac-70: No such device Nov 26 
05:03:59 localhost journal[229445]: ethtool ioctl error on tap94a1f8ac-70: No such device Nov 26 05:03:59 localhost journal[229445]: ethtool ioctl error on tap94a1f8ac-70: No such device Nov 26 05:03:59 localhost nova_compute[281415]: 2025-11-26 10:03:59.036 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:59 localhost nova_compute[281415]: 2025-11-26 10:03:59.070 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:03:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e163 e163: 6 total, 6 up, 6 in Nov 26 05:03:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:04:00 localhost podman[322901]: Nov 26 05:04:00 localhost podman[322901]: 2025-11-26 10:04:00.048109767 +0000 UTC m=+0.091795234 container create bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:04:00 localhost systemd[1]: Started libpod-conmon-bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971.scope. 
Nov 26 05:04:00 localhost podman[322901]: 2025-11-26 10:04:00.004771265 +0000 UTC m=+0.048456792 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:04:00 localhost systemd[1]: tmp-crun.GJydOR.mount: Deactivated successfully. Nov 26 05:04:00 localhost systemd[1]: Started libcrun container. Nov 26 05:04:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ac8d04c13e3f9f8d4d758d57b8a7d1d5c4ec631caf768a741cd4e54944fda03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:04:00 localhost podman[322901]: 2025-11-26 10:04:00.133238906 +0000 UTC m=+0.176924373 container init bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:04:00 localhost podman[322901]: 2025-11-26 10:04:00.141894744 +0000 UTC m=+0.185580231 container start bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:04:00 localhost dnsmasq[322919]: started, version 2.85 cachesize 150 Nov 26 05:04:00 
localhost dnsmasq[322919]: DNS service limited to local subnets Nov 26 05:04:00 localhost dnsmasq[322919]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:04:00 localhost dnsmasq[322919]: warning: no upstream servers configured Nov 26 05:04:00 localhost dnsmasq-dhcp[322919]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:04:00 localhost dnsmasq[322919]: read /var/lib/neutron/dhcp/fb1cc77c-45ea-4dd3-a42a-ffba463f728a/addn_hosts - 0 addresses Nov 26 05:04:00 localhost dnsmasq-dhcp[322919]: read /var/lib/neutron/dhcp/fb1cc77c-45ea-4dd3-a42a-ffba463f728a/host Nov 26 05:04:00 localhost dnsmasq-dhcp[322919]: read /var/lib/neutron/dhcp/fb1cc77c-45ea-4dd3-a42a-ffba463f728a/opts Nov 26 05:04:00 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:00.347 262471 INFO neutron.agent.dhcp.agent [None req-63db62a6-3f4b-4e81-b319-7229c5a39b27 - - - - - -] DHCP configuration for ports {'e45c4f6b-6662-4583-a3df-8e04cae69d30'} is completed#033[00m Nov 26 05:04:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e164 e164: 6 total, 6 up, 6 in Nov 26 05:04:01 localhost ovn_controller[153664]: 2025-11-26T10:04:01Z|00398|binding|INFO|Removing iface tap94a1f8ac-70 ovn-installed in OVS Nov 26 05:04:01 localhost kernel: device tap94a1f8ac-70 left promiscuous mode Nov 26 05:04:01 localhost nova_compute[281415]: 2025-11-26 10:04:01.209 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:01 localhost ovn_controller[153664]: 2025-11-26T10:04:01Z|00399|binding|INFO|Removing lport 94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7 ovn-installed in OVS Nov 26 05:04:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:01.212 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port a481e85c-a791-4e57-875a-553f94a18cdb with type ""#033[00m 
Nov 26 05:04:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:01.215 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-fb1cc77c-45ea-4dd3-a42a-ffba463f728a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb1cc77c-45ea-4dd3-a42a-ffba463f728a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '140ee02dff30450e88d5baa79f6f7df2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d3b03c9-c5a8-4998-b1ed-401a93ac8687, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:01.218 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 94a1f8ac-70de-4a59-8a56-1a87ef1ec6c7 in datapath fb1cc77c-45ea-4dd3-a42a-ffba463f728a unbound from our chassis#033[00m Nov 26 05:04:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:01.222 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb1cc77c-45ea-4dd3-a42a-ffba463f728a, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 26 05:04:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:01.223 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[be7455be-cdf4-43b7-81ee-4619ba8c3346]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 26 05:04:01 localhost nova_compute[281415]: 2025-11-26 10:04:01.230 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:04:01 localhost neutron_sriov_agent[255515]: 2025-11-26 10:04:01.565 2 INFO neutron.agent.securitygroups_rpc [None req-017455e6-2606-46e8-b8bc-e0ee787202b0 be7e3593ef5549ccbb3a9112d216d4df bfa7121af70e41c0a159446c2be9337b - - default default] Security group rule updated ['7ea3d147-ab72-4b99-a077-41a459e6634c']
Nov 26 05:04:01 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 26 05:04:01 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg'
Nov 26 05:04:01 localhost dnsmasq[322919]: read /var/lib/neutron/dhcp/fb1cc77c-45ea-4dd3-a42a-ffba463f728a/addn_hosts - 0 addresses
Nov 26 05:04:01 localhost dnsmasq-dhcp[322919]: read /var/lib/neutron/dhcp/fb1cc77c-45ea-4dd3-a42a-ffba463f728a/host
Nov 26 05:04:01 localhost dnsmasq-dhcp[322919]: read /var/lib/neutron/dhcp/fb1cc77c-45ea-4dd3-a42a-ffba463f728a/opts
Nov 26 05:04:01 localhost podman[323024]: 2025-11-26 10:04:01.758388238 +0000 UTC m=+0.066557793 container kill bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 26 05:04:01 localhost ovn_controller[153664]: 2025-11-26T10:04:01Z|00400|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent [None req-086273e5-0f12-434f-b8f4-6500ef8a8b5a - - - - - -] Unable to reload_allocations dhcp for fb1cc77c-45ea-4dd3-a42a-ffba463f728a.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap94a1f8ac-70 not found in namespace qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a.
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap94a1f8ac-70 not found in namespace qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a.
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.788 262471 ERROR neutron.agent.dhcp.agent
Nov 26 05:04:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:01.792 262471 INFO neutron.agent.dhcp.agent [None req-f696aa43-fc6b-4141-a070-5e294a3f51b6 - - - - - -] Synchronizing state
Nov 26 05:04:01 localhost nova_compute[281415]: 2025-11-26 10:04:01.807 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:04:01 localhost nova_compute[281415]: 2025-11-26 10:04:01.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 26 05:04:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:02.006 262471 INFO neutron.agent.dhcp.agent [None req-9332b23d-493b-4ff8-b1bf-5c8958814d03 - - - - - -] All active networks have been fetched through RPC.
Nov 26 05:04:02 localhost nova_compute[281415]: 2025-11-26 10:04:02.107 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:04:02 localhost dnsmasq[322919]: exiting on receipt of SIGTERM
Nov 26 05:04:02 localhost podman[323055]: 2025-11-26 10:04:02.235092938 +0000 UTC m=+0.064594282 container kill bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 26 05:04:02 localhost systemd[1]: libpod-bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971.scope: Deactivated successfully.
Nov 26 05:04:02 localhost nova_compute[281415]: 2025-11-26 10:04:02.305 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:04:02 localhost podman[323069]: 2025-11-26 10:04:02.321241608 +0000 UTC m=+0.071729914 container died bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 26 05:04:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971-userdata-shm.mount: Deactivated successfully.
Nov 26 05:04:02 localhost podman[323069]: 2025-11-26 10:04:02.369156132 +0000 UTC m=+0.119644408 container cleanup bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 26 05:04:02 localhost systemd[1]: libpod-conmon-bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971.scope: Deactivated successfully.
Nov 26 05:04:02 localhost podman[323071]: 2025-11-26 10:04:02.456101316 +0000 UTC m=+0.198558613 container remove bb626efa5c0f671543d2923074f6b854e07b734435fb5d18cdc0ae1200479971 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb1cc77c-45ea-4dd3-a42a-ffba463f728a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 26 05:04:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:02.483 262471 INFO neutron.agent.dhcp.agent [None req-a13af6bd-d46d-4aa1-9e20-a244c593e34a - - - - - -] Synchronizing state complete
Nov 26 05:04:02 localhost nova_compute[281415]: 2025-11-26 10:04:02.844 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 26 05:04:03 localhost systemd[1]: var-lib-containers-storage-overlay-5ac8d04c13e3f9f8d4d758d57b8a7d1d5c4ec631caf768a741cd4e54944fda03-merged.mount: Deactivated successfully.
Nov 26 05:04:03 localhost systemd[1]: run-netns-qdhcp\x2dfb1cc77c\x2d45ea\x2d4dd3\x2da42a\x2dffba463f728a.mount: Deactivated successfully.
Nov 26 05:04:03 localhost nova_compute[281415]: 2025-11-26 10:04:03.335 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.586 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.624 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.625 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a061ef2d-ca93-4139-92b4-f00a369cb114', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.587861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3cc70a8a-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': '88d3f238b5df3c5a60d2d24e59eb5f2b27a7f5b7d957bb0bc0f84e4334486468'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:04:03.587861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3cc72b32-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': '879c02624336f4617c16eef0d70207ddf0f471953a8c15f67f15b2109582d9e0'}]}, 'timestamp': '2025-11-26 10:04:03.626176', '_unique_id': '8ae2fb91313143cabfa363a1ba8c77aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.627 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.629 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.633 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efe2c4ab-785a-4640-adab-b5bc35d75b78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7557, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.629524', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3cc86fe2-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': '130be70f81b595e4a8c1c154635e40eb38b5298ef3ebe687591f26136ea6fbf7'}]}, 'timestamp': '2025-11-26 10:04:03.634417', '_unique_id': '06fdad04cbfe46dd8e247e507f8e7589'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.635 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.636 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.636 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 05:04:03 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:04:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.654 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.655 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7bfcb784-e0fa-4611-8a88-2137fdba2948', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.636744', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3ccba338-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.878993858, 'message_signature': '7548d43ada65856a8aa7f433ee1201677b768b716e31cc246cc213e92cfdca3d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:04:03.636744', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3ccbb436-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.878993858, 'message_signature': 'fd37d9e573a93b50223858e7b570aa958f71cd44146f554dce0d5640060bc5ce'}]}, 'timestamp': '2025-11-26 10:04:03.655785', '_unique_id': 'b69be946bf3841d399cd54991d1d7d6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:04:03.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:04:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.658 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8f5414f3-3145-45df-8d94-6d0e26e75029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.658070', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3ccc1e62-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': '6f5e6c364b2bebb6c77bea9b22729553b71be69d660a7d237edc76214214857a'}]}, 'timestamp': '2025-11-26 10:04:03.658534', '_unique_id': 'f8f1535b47bb4373882342e4443f535b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.660 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f49d70ad-90bb-4fab-827b-0201c229b40e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.660650', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3ccc82da-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': 'fd87259391c756f8a213f94d44e8772a26123c5b3a0edc3e3cf98d3672aefd42'}]}, 'timestamp': '2025-11-26 10:04:03.661137', '_unique_id': 'c394ad290c9a4ca68cb09e4e4aec5ec1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.663 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eba3654d-c2ef-468c-be7d-969a6552267c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.663233', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3ccce7a2-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': '21f307f276829584fdfbff074a1cac85674780ffade33d6d237e580791ad6cce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:04:03.663233', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3cccf7ba-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': 'd588dd549b4dc12296e6ed031d01bb64b2b9f6a11c44f060550b9fbb929ce816'}]}, 'timestamp': '2025-11-26 10:04:03.664101', '_unique_id': '036af4a20cc74154b9368853100b554e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.666 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.666 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '036de256-529b-4680-870e-f5f7402c2e13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.666247', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3ccd5d7c-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': '9b166836eed2cdaaf08669f988b394bcd20ae70c58d0e44b268ea393be4ae485'}]}, 'timestamp': '2025-11-26 10:04:03.666700', '_unique_id': '0886780d784c4c13b4ff502fbcc3f493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.668 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.668 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d3a18e4-a4ec-4de5-953f-ef257c24376b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.668799', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3ccdc26c-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': '60418929a36ba0fe666431b3ebbd7ea2797e77e215d41040e2a0d206b1c17866'}]}, 'timestamp': '2025-11-26 10:04:03.669287', '_unique_id': 'c35ea9b785a943749cd5a1e735fbc992'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.671 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.671 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:03.671 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:04:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:03.672 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:04:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:03.672 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac2797f7-debe-4628-ba08-9873a89ec4ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.671479', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3cce29a0-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 
'message_signature': '64895e58d3d5a5588dc44e80d9751b275283f004977befe1b9383d9348167d7f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:04:03.671479', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3cce3b3e-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': 'a660ede7c1b276648245a44af27355c7d54d5854ca6e3f71e6c6555363c24392'}]}, 'timestamp': '2025-11-26 10:04:03.672349', '_unique_id': '0f173c9f732e4559b3ae729185b0eadc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 
26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.673 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.674 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:04:03.674 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4eb42f18-59fd-49fb-b60c-4c57938afb85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.674615', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 
0, 'disk_name': 'vda'}, 'message_id': '3ccea434-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': '318c28c53244292c203f1585f59e28ab00a454008fb2a45f6c8bfc88fd1f56e3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:04:03.674615', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3cceb5aa-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': '0d9a9c439bb7a1395b395b49a5005e98cb60975505b228b5e5cda5bb3e8ecfeb'}]}, 'timestamp': '2025-11-26 10:04:03.675484', '_unique_id': '78dbcf0c6aa848e98727bb5f5c2566e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.676 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster 
network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.677 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f659e9c8-ebd2-4e38-a852-690197f5d86b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.677874', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3ccf24ea-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': '2bcc918c2dd92e2d6fe493b57be6a388319e46f538202795ce2f55a99f436cb4'}]}, 'timestamp': '2025-11-26 10:04:03.678360', '_unique_id': 'f6d8f6febeb84eba8e8459ed89c9de72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.679 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.680 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c1a2632a-5c0a-4eb4-b1aa-099370956b58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.680422', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3ccf8700-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': '1350a5a6cbbf54d817344c10bd1df584f927f72104520802e784f3dc0ec99e7c'}]}, 'timestamp': '2025-11-26 10:04:03.680869', '_unique_id': '7f27064d6f184fcdb87d5254b494975f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.681 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.682 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.683 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0dd250aa-8029-408a-80f0-cd1f276152e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.682981', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3ccfeb5a-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': '4224e8fbd8d7a358d472d9694d2aa2b042b6f90a8b96a128ffb86afa795a5e5d'}]}, 'timestamp': '2025-11-26 10:04:03.683439', '_unique_id': '0a0bfe85e3764fdfb51f16d81f02690a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.684 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.685 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 68 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a87f8765-dc17-4bd5-99f0-d54dd2a8b8bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 68, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.685484', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3cd04cb2-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': '7a38690e7fc80a25d32b9874389738b81ecc816f51555a623da25344f7cef9d2'}]}, 'timestamp': '2025-11-26 10:04:03.685991', '_unique_id': '94620eb39ee74e5aabf371bce8afd78c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.686 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.687 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.687 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3f32a8e-d52d-40d4-855c-af2b8799c7c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:04:03.687393', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '3cd09424-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.871834427, 'message_signature': 'ac4a714ff4d3ba5118331d91e4d79f6c97ac14683a723e8c1626ba1dfac06054'}]}, 'timestamp': '2025-11-26 10:04:03.687675', '_unique_id': '3f3e5085f4274d559bc00dab20b7e47c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.688 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.689 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4c183225-e947-4d73-9fdb-4a1d01d214c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.688965', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3cd0d1a0-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': 'abd99c538293aea4c00584c1d0bbdae97d052ee24929eafb9880d70205eca05b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:04:03.688965', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3cd0dbaa-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': '8b66af08d13a6bd35ca743be4a62e638d346aef81e5fb393aa3d611db6c8ec26'}]}, 'timestamp': '2025-11-26 10:04:03.689486', '_unique_id': 'ce674366dd734f3798e2fef280146039'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4742a11d-3b53-484c-bec1-d062d9fa0c19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.690803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3cd11a16-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': 'a50b19b546e725d4b2bfac0816015136092da68cc148c66921b021e09a5190c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': 
'2025-11-26T10:04:03.690803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3cd123e4-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.830145395, 'message_signature': 'c00d7b49e2c4810f8f7ce59cb9fd727831689cf4a2ca9eede8dd83844fc92f1f'}]}, 'timestamp': '2025-11-26 10:04:03.691335', '_unique_id': 'f6a4ae2e9feb43dc95834ede0e4b148d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.691 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.692 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.707 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 17250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0c1376a-3600-4305-97df-8b2c3ac419ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17250000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:04:03.692612', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '3cd3adc6-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.949819914, 'message_signature': 'd888888a19eead74c1a5b416bf75359b8b79d8f1569e130087dc480868e3bb7f'}]}, 'timestamp': '2025-11-26 10:04:03.708010', '_unique_id': 'd2995fba989e41be8570acd6ee10addb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.709 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.709 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f75fa7a2-bbdc-40a8-8c1e-823ade074fea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.709826', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3cd40208-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.878993858, 'message_signature': '3afcf5c28d9cb1d5b35869f6aa26f9089b281752d1c999d4bbb5d662088644b5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:04:03.709826', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3cd40bcc-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.878993858, 'message_signature': 'bc1d1c2ff33cc611c51f6a7214d4fb96539f6bcc438eaefd445d98d9d7cc8f2b'}]}, 'timestamp': '2025-11-26 10:04:03.710380', '_unique_id': '4bb9a9cb680b4a638fd02bc415ca97aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.710 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.711 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.711 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa96ab1d-d251-4490-888a-d0deb67dc077', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:04:03.711692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3cd4495c-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.878993858, 'message_signature': '4492aeee180a486e2abc80a7217173a67c922404691c3b209666cbc1aa83f7dd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:04:03.711692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3cd45410-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.878993858, 'message_signature': 'ee75b8789c7fd20fc9a33b776d854dac329169b3699702f6fdcdcdf0a8a68091'}]}, 'timestamp': '2025-11-26 10:04:03.712230', '_unique_id': 'c20ecd07c0924ad5b037acd56fcf5d2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.713 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5299785-4d42-4dab-848a-b98d89a0c5c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:04:03.713510', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '3cd49056-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 11918.949819914, 
'message_signature': 'f1f4261fd9c4ead207fea13eb360df59931c9ae8dbbc797e318baa2a2a3cea90'}]}, 'timestamp': '2025-11-26 10:04:03.713780', '_unique_id': 'eb342b380bf748838d000e4cca9cdeba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:04:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:04:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:04:03.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:04:03 localhost podman[323104]: 2025-11-26 10:04:03.75207854 +0000 UTC m=+0.098211194 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible) Nov 26 05:04:03 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:03.758 262471 INFO neutron.agent.linux.ip_lib [None req-1b751a54-2552-4b0c-832d-bd165034bb2e - - - - - -] Device tap1d8a5772-15 cannot be used as it has no MAC address#033[00m Nov 26 05:04:03 localhost podman[323104]: 2025-11-26 10:04:03.76335856 +0000 UTC m=+0.109491214 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 26 05:04:03 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:04:03 localhost nova_compute[281415]: 2025-11-26 10:04:03.785 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:03 localhost kernel: device tap1d8a5772-15 entered promiscuous mode Nov 26 05:04:03 localhost NetworkManager[5970]: [1764151443.7963] manager: (tap1d8a5772-15): new Generic device (/org/freedesktop/NetworkManager/Devices/65) Nov 26 05:04:03 localhost ovn_controller[153664]: 2025-11-26T10:04:03Z|00401|binding|INFO|Claiming lport 1d8a5772-15ec-4f57-a020-fff90b2c501e for this chassis. 
Nov 26 05:04:03 localhost ovn_controller[153664]: 2025-11-26T10:04:03Z|00402|binding|INFO|1d8a5772-15ec-4f57-a020-fff90b2c501e: Claiming unknown Nov 26 05:04:03 localhost nova_compute[281415]: 2025-11-26 10:04:03.795 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:03 localhost systemd-udevd[323139]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:04:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:03.804 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cf1e4ba9-4399-4222-8d26-929327690ad2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf1e4ba9-4399-4222-8d26-929327690ad2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '140ee02dff30450e88d5baa79f6f7df2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c54fbd2-5e7f-43db-969f-72ead4b85244, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d8a5772-15ec-4f57-a020-fff90b2c501e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:03.805 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port 1d8a5772-15ec-4f57-a020-fff90b2c501e in datapath cf1e4ba9-4399-4222-8d26-929327690ad2 bound to our chassis#033[00m Nov 26 05:04:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:03.806 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cf1e4ba9-4399-4222-8d26-929327690ad2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:03.807 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[648aacf8-8964-41ed-a2ed-b6ad36dde10b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:03 localhost ovn_controller[153664]: 2025-11-26T10:04:03Z|00403|binding|INFO|Setting lport 1d8a5772-15ec-4f57-a020-fff90b2c501e ovn-installed in OVS Nov 26 05:04:03 localhost ovn_controller[153664]: 2025-11-26T10:04:03Z|00404|binding|INFO|Setting lport 1d8a5772-15ec-4f57-a020-fff90b2c501e up in Southbound Nov 26 05:04:03 localhost nova_compute[281415]: 2025-11-26 10:04:03.827 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:03 localhost journal[229445]: ethtool ioctl error on tap1d8a5772-15: No such device Nov 26 05:04:03 localhost journal[229445]: ethtool ioctl error on tap1d8a5772-15: No such device Nov 26 05:04:03 localhost journal[229445]: ethtool ioctl error on tap1d8a5772-15: No such device Nov 26 05:04:03 localhost nova_compute[281415]: 2025-11-26 10:04:03.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:03 localhost 
nova_compute[281415]: 2025-11-26 10:04:03.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:03 localhost journal[229445]: ethtool ioctl error on tap1d8a5772-15: No such device Nov 26 05:04:03 localhost journal[229445]: ethtool ioctl error on tap1d8a5772-15: No such device Nov 26 05:04:03 localhost journal[229445]: ethtool ioctl error on tap1d8a5772-15: No such device Nov 26 05:04:03 localhost journal[229445]: ethtool ioctl error on tap1d8a5772-15: No such device Nov 26 05:04:03 localhost podman[323103]: 2025-11-26 10:04:03.86729935 +0000 UTC m=+0.214617131 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:04:03 localhost journal[229445]: ethtool ioctl error on tap1d8a5772-15: No such device Nov 26 05:04:03 localhost nova_compute[281415]: 2025-11-26 10:04:03.874 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:03 localhost nova_compute[281415]: 2025-11-26 10:04:03.905 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:03 localhost podman[323103]: 2025-11-26 10:04:03.90763386 +0000 UTC m=+0.254951621 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 05:04:03 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:04:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:04:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:04:04 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2248385811' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:04:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:04:04 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2248385811' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:04:05 localhost podman[323221]: Nov 26 05:04:05 localhost podman[323221]: 2025-11-26 10:04:05.256581956 +0000 UTC m=+0.097229394 container create 4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:04:05 localhost systemd[1]: Started libpod-conmon-4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f.scope. Nov 26 05:04:05 localhost podman[323221]: 2025-11-26 10:04:05.211432296 +0000 UTC m=+0.052079824 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:04:05 localhost systemd[1]: tmp-crun.9eoZ93.mount: Deactivated successfully. Nov 26 05:04:05 localhost systemd[1]: Started libcrun container. 
Nov 26 05:04:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70c6c23e2a79a20b2d39bf59bb5e4e96e698d134c859bb4969f0f763b7094c74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:04:05 localhost podman[323221]: 2025-11-26 10:04:05.369909946 +0000 UTC m=+0.210557374 container init 4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:04:05 localhost podman[323221]: 2025-11-26 10:04:05.378824052 +0000 UTC m=+0.219471480 container start 4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:04:05 localhost dnsmasq[323239]: started, version 2.85 cachesize 150 Nov 26 05:04:05 localhost dnsmasq[323239]: DNS service limited to local subnets Nov 26 05:04:05 localhost dnsmasq[323239]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:04:05 localhost dnsmasq[323239]: warning: no upstream servers 
configured Nov 26 05:04:05 localhost dnsmasq-dhcp[323239]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:04:05 localhost dnsmasq[323239]: read /var/lib/neutron/dhcp/cf1e4ba9-4399-4222-8d26-929327690ad2/addn_hosts - 0 addresses Nov 26 05:04:05 localhost dnsmasq-dhcp[323239]: read /var/lib/neutron/dhcp/cf1e4ba9-4399-4222-8d26-929327690ad2/host Nov 26 05:04:05 localhost dnsmasq-dhcp[323239]: read /var/lib/neutron/dhcp/cf1e4ba9-4399-4222-8d26-929327690ad2/opts Nov 26 05:04:05 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:05.578 262471 INFO neutron.agent.dhcp.agent [None req-5215f379-3e17-4aa8-a16d-0716265b8edd - - - - - -] DHCP configuration for ports {'088f46d5-a752-47a4-bb5c-6e125974476c'} is completed#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 10:04:05.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 10:04:05.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 10:04:05.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 10:04:05.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 10:04:05.874 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 10:04:05.874 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 10:04:05.875 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 10:04:05.875 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:04:05 localhost nova_compute[281415]: 2025-11-26 
10:04:05.876 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:04:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:06.022 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3a06e652-e670-4e1f-9ce4-a96b43c001b4 with type ""#033[00m Nov 26 05:04:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:06.023 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-cf1e4ba9-4399-4222-8d26-929327690ad2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf1e4ba9-4399-4222-8d26-929327690ad2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '140ee02dff30450e88d5baa79f6f7df2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c54fbd2-5e7f-43db-969f-72ead4b85244, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d8a5772-15ec-4f57-a020-fff90b2c501e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:06 localhost ovn_controller[153664]: 
2025-11-26T10:04:06Z|00405|binding|INFO|Removing iface tap1d8a5772-15 ovn-installed in OVS Nov 26 05:04:06 localhost ovn_controller[153664]: 2025-11-26T10:04:06Z|00406|binding|INFO|Removing lport 1d8a5772-15ec-4f57-a020-fff90b2c501e ovn-installed in OVS Nov 26 05:04:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:06.026 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 1d8a5772-15ec-4f57-a020-fff90b2c501e in datapath cf1e4ba9-4399-4222-8d26-929327690ad2 unbound from our chassis#033[00m Nov 26 05:04:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:06.029 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cf1e4ba9-4399-4222-8d26-929327690ad2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.030 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:06.030 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[19e09faa-88a3-4735-bd94-f8340170f60f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:06 localhost kernel: device tap1d8a5772-15 left promiscuous mode Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.056 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.213 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.361 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] 
CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.437 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.438 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:04:06 localhost systemd[1]: tmp-crun.7zmCPi.mount: Deactivated successfully. Nov 26 05:04:06 localhost podman[323281]: 2025-11-26 10:04:06.696779548 +0000 UTC m=+0.100599978 container kill 4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 05:04:06 localhost dnsmasq[323239]: read /var/lib/neutron/dhcp/cf1e4ba9-4399-4222-8d26-929327690ad2/addn_hosts - 0 addresses Nov 26 05:04:06 localhost dnsmasq-dhcp[323239]: read /var/lib/neutron/dhcp/cf1e4ba9-4399-4222-8d26-929327690ad2/host Nov 26 05:04:06 localhost dnsmasq-dhcp[323239]: read /var/lib/neutron/dhcp/cf1e4ba9-4399-4222-8d26-929327690ad2/opts Nov 26 05:04:06 
localhost nova_compute[281415]: 2025-11-26 10:04:06.698 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.700 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11244MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.701 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.701 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent [None req-43a11324-2a84-44b1-b6bc-cde09bc34a34 - - - - - -] Unable to reload_allocations dhcp for cf1e4ba9-4399-4222-8d26-929327690ad2.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap1d8a5772-15 not found in namespace qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2. 
Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 26 05:04:06 
localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent return fut.result() Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent raise self._exception Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap1d8a5772-15 not found in namespace qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2. Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.731 262471 ERROR neutron.agent.dhcp.agent #033[00m Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.734 262471 INFO neutron.agent.dhcp.agent [None req-a13af6bd-d46d-4aa1-9e20-a244c593e34a - - - - - -] Synchronizing state#033[00m Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.911 262471 INFO neutron.agent.dhcp.agent [None req-1d75ae1e-e240-4995-88ba-7660190e193a - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.913 262471 INFO neutron.agent.dhcp.agent [-] Starting network cf1e4ba9-4399-4222-8d26-929327690ad2 dhcp configuration#033[00m Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.913 262471 INFO neutron.agent.dhcp.agent [-] Finished network cf1e4ba9-4399-4222-8d26-929327690ad2 dhcp configuration#033[00m Nov 26 05:04:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:06.914 262471 INFO neutron.agent.dhcp.agent [None req-1d75ae1e-e240-4995-88ba-7660190e193a - - - - - -] Synchronizing state complete#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.985 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.986 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:04:06 localhost nova_compute[281415]: 2025-11-26 10:04:06.986 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.057 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.149 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 
05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.150 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.154 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.169 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 05:04:07 localhost dnsmasq[323239]: exiting on receipt of SIGTERM Nov 26 05:04:07 localhost podman[323313]: 2025-11-26 10:04:07.205340505 +0000 UTC m=+0.112481626 container kill 4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:04:07 localhost systemd[1]: libpod-4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f.scope: Deactivated successfully. Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.214 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_F16C,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 
05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.253 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:04:07 localhost systemd[1]: tmp-crun.7GlBW6.mount: Deactivated successfully. Nov 26 05:04:07 localhost podman[323324]: 2025-11-26 10:04:07.290843254 +0000 UTC m=+0.068796493 container died 4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true) Nov 26 05:04:07 localhost systemd[1]: tmp-crun.L6G2bD.mount: Deactivated successfully. Nov 26 05:04:07 localhost podman[323324]: 2025-11-26 10:04:07.335069545 +0000 UTC m=+0.113022764 container cleanup 4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:04:07 localhost systemd[1]: libpod-conmon-4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f.scope: Deactivated successfully. 
Nov 26 05:04:07 localhost ovn_controller[153664]: 2025-11-26T10:04:07Z|00407|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:07 localhost podman[323326]: 2025-11-26 10:04:07.383419153 +0000 UTC m=+0.151385532 container remove 4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cf1e4ba9-4399-4222-8d26-929327690ad2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.390 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:04:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/655775371' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.716 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.724 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.746 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.786 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.786 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.787 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.788 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.810 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.810 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:07 localhost nova_compute[281415]: 2025-11-26 10:04:07.810 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 26 05:04:08 localhost systemd[1]: 
var-lib-containers-storage-overlay-70c6c23e2a79a20b2d39bf59bb5e4e96e698d134c859bb4969f0f763b7094c74-merged.mount: Deactivated successfully. Nov 26 05:04:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4cf4d249f24202232fc9d0dcfd2ee6eb264c02430f8e588fc9a32efeb50ed52f-userdata-shm.mount: Deactivated successfully. Nov 26 05:04:08 localhost systemd[1]: run-netns-qdhcp\x2dcf1e4ba9\x2d4399\x2d4222\x2d8d26\x2d929327690ad2.mount: Deactivated successfully. Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.364 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:08 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e165 e165: 6 total, 6 up, 6 in Nov 26 05:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.815 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.838 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.839 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.839 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:04:08 localhost podman[323373]: 2025-11-26 10:04:08.839953552 +0000 UTC m=+0.094172439 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 05:04:08 localhost systemd[1]: tmp-crun.A0m9Of.mount: Deactivated successfully. Nov 26 05:04:08 localhost podman[323373]: 2025-11-26 10:04:08.899590589 +0000 UTC m=+0.153809436 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Nov 26 05:04:08 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.934 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.934 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.935 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:04:08 localhost nova_compute[281415]: 2025-11-26 10:04:08.935 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:04:08 localhost podman[323374]: 2025-11-26 10:04:08.900455596 +0000 UTC m=+0.154315152 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 05:04:08 localhost podman[323374]: 2025-11-26 10:04:08.981042253 +0000 UTC m=+0.234901789 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Nov 26 05:04:08 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:04:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:09.069 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:09.072 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:04:09 localhost nova_compute[281415]: 2025-11-26 10:04:09.072 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:09 localhost nova_compute[281415]: 2025-11-26 10:04:09.574 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": 
"192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:04:09 localhost nova_compute[281415]: 2025-11-26 10:04:09.599 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:04:09 localhost nova_compute[281415]: 2025-11-26 10:04:09.600 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:04:09 localhost nova_compute[281415]: 2025-11-26 10:04:09.600 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:04:12 localhost nova_compute[281415]: 2025-11-26 10:04:12.188 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e166 e166: 6 total, 6 up, 6 in Nov 26 05:04:13 localhost nova_compute[281415]: 2025-11-26 10:04:13.404 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:04:13 localhost podman[323420]: 2025-11-26 10:04:13.831075775 +0000 UTC m=+0.089445652 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, 
maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:04:13 localhost podman[323420]: 2025-11-26 10:04:13.86191142 +0000 UTC m=+0.120281337 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 05:04:13 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 05:04:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:04:14 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/944958916' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:04:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:04:14 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/944958916' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:04:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:04:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e167 e167: 6 total, 6 up, 6 in Nov 26 05:04:15 localhost openstack_network_exporter[242153]: ERROR 10:04:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:04:15 localhost openstack_network_exporter[242153]: ERROR 10:04:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:04:15 localhost openstack_network_exporter[242153]: ERROR 10:04:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:04:15 localhost openstack_network_exporter[242153]: ERROR 10:04:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:04:15 localhost openstack_network_exporter[242153]: Nov 26 05:04:15 localhost openstack_network_exporter[242153]: ERROR 10:04:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:04:15 localhost openstack_network_exporter[242153]: Nov 26 05:04:15 localhost nova_compute[281415]: 2025-11-26 10:04:15.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task 
ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e168 e168: 6 total, 6 up, 6 in Nov 26 05:04:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:16.274 262471 INFO neutron.agent.linux.ip_lib [None req-bfa349c0-55f7-4016-8506-087ead42f744 - - - - - -] Device tap8c5e19ce-a7 cannot be used as it has no MAC address#033[00m Nov 26 05:04:16 localhost nova_compute[281415]: 2025-11-26 10:04:16.308 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:16 localhost kernel: device tap8c5e19ce-a7 entered promiscuous mode Nov 26 05:04:16 localhost ovn_controller[153664]: 2025-11-26T10:04:16Z|00408|binding|INFO|Claiming lport 8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d for this chassis. Nov 26 05:04:16 localhost NetworkManager[5970]: [1764151456.3168] manager: (tap8c5e19ce-a7): new Generic device (/org/freedesktop/NetworkManager/Devices/66) Nov 26 05:04:16 localhost ovn_controller[153664]: 2025-11-26T10:04:16Z|00409|binding|INFO|8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d: Claiming unknown Nov 26 05:04:16 localhost nova_compute[281415]: 2025-11-26 10:04:16.317 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:16 localhost systemd-udevd[323452]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:04:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:16.343 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-96766190-d4c9-4230-a751-90a95a31e14a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-96766190-d4c9-4230-a751-90a95a31e14a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a28ce44d2e9a40519a8955587c056dae', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b657d111-bf25-44bc-bb74-ac3717e5ddee, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:16.345 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d in datapath 96766190-d4c9-4230-a751-90a95a31e14a bound to our chassis#033[00m Nov 26 05:04:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:16.347 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 96766190-d4c9-4230-a751-90a95a31e14a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:16.348 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[18130896-5c68-426b-8152-8653838430af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:16 localhost ovn_controller[153664]: 2025-11-26T10:04:16Z|00410|binding|INFO|Setting lport 8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d ovn-installed in OVS Nov 26 05:04:16 localhost ovn_controller[153664]: 2025-11-26T10:04:16Z|00411|binding|INFO|Setting lport 8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d up in Southbound Nov 26 05:04:16 localhost journal[229445]: ethtool ioctl error on tap8c5e19ce-a7: No such device Nov 26 05:04:16 localhost nova_compute[281415]: 2025-11-26 10:04:16.356 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:16 localhost journal[229445]: ethtool ioctl error on tap8c5e19ce-a7: No such device Nov 26 05:04:16 localhost journal[229445]: ethtool ioctl error on tap8c5e19ce-a7: No such device Nov 26 05:04:16 localhost journal[229445]: ethtool ioctl error on tap8c5e19ce-a7: No such device Nov 26 05:04:16 localhost journal[229445]: ethtool ioctl error on tap8c5e19ce-a7: No such device Nov 26 05:04:16 localhost journal[229445]: ethtool ioctl error on tap8c5e19ce-a7: No such device Nov 26 05:04:16 localhost journal[229445]: ethtool ioctl error on tap8c5e19ce-a7: No such device Nov 26 05:04:16 localhost journal[229445]: ethtool ioctl error on tap8c5e19ce-a7: No such device Nov 26 05:04:16 localhost nova_compute[281415]: 2025-11-26 10:04:16.408 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:16 localhost nova_compute[281415]: 2025-11-26 10:04:16.445 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e169 e169: 6 total, 6 up, 6 in Nov 26 05:04:17 localhost nova_compute[281415]: 2025-11-26 10:04:17.233 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:17 localhost podman[323523]: Nov 26 05:04:17 localhost podman[323523]: 2025-11-26 10:04:17.335433223 +0000 UTC m=+0.077665887 container create edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:04:17 localhost systemd[1]: Started libpod-conmon-edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f.scope. Nov 26 05:04:17 localhost podman[323523]: 2025-11-26 10:04:17.296093584 +0000 UTC m=+0.038326248 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:04:17 localhost systemd[1]: Started libcrun container. 
Nov 26 05:04:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba3e684ade5c8f781716bac137847392521841213cf36eb9bdc380b6503fb665/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:04:17 localhost podman[323523]: 2025-11-26 10:04:17.415173444 +0000 UTC m=+0.157406088 container init edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:04:17 localhost podman[323523]: 2025-11-26 10:04:17.423822252 +0000 UTC m=+0.166054896 container start edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:04:17 localhost dnsmasq[323541]: started, version 2.85 cachesize 150 Nov 26 05:04:17 localhost dnsmasq[323541]: DNS service limited to local subnets Nov 26 05:04:17 localhost dnsmasq[323541]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:04:17 localhost dnsmasq[323541]: warning: no upstream servers 
configured Nov 26 05:04:17 localhost dnsmasq-dhcp[323541]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:04:17 localhost dnsmasq[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/addn_hosts - 0 addresses Nov 26 05:04:17 localhost dnsmasq-dhcp[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/host Nov 26 05:04:17 localhost dnsmasq-dhcp[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/opts Nov 26 05:04:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:17.476 262471 INFO neutron.agent.dhcp.agent [None req-bfa349c0-55f7-4016-8506-087ead42f744 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:04:15Z, description=, device_id=50b9da03-77ab-4b5e-8c44-e89155fd90ed, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c00e43a3-b8ae-4fa9-92fc-26533a047feb, ip_allocation=immediate, mac_address=fa:16:3e:9e:65:94, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:04:13Z, description=, dns_domain=, id=96766190-d4c9-4230-a751-90a95a31e14a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1147142383, port_security_enabled=True, project_id=a28ce44d2e9a40519a8955587c056dae, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52125, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2461, status=ACTIVE, subnets=['90dd1661-b439-4cc8-82f3-6ee6215a5ea6'], tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:04:14Z, vlan_transparent=None, network_id=96766190-d4c9-4230-a751-90a95a31e14a, port_security_enabled=False, 
project_id=a28ce44d2e9a40519a8955587c056dae, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2473, status=DOWN, tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:04:15Z on network 96766190-d4c9-4230-a751-90a95a31e14a#033[00m Nov 26 05:04:17 localhost nova_compute[281415]: 2025-11-26 10:04:17.495 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:17.605 262471 INFO neutron.agent.dhcp.agent [None req-ef141fa5-5d73-4fcd-bcc6-ca4e4b08d6b8 - - - - - -] DHCP configuration for ports {'2600002c-6077-47ec-a572-2821511d061e'} is completed#033[00m Nov 26 05:04:17 localhost dnsmasq[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/addn_hosts - 1 addresses Nov 26 05:04:17 localhost dnsmasq-dhcp[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/host Nov 26 05:04:17 localhost dnsmasq-dhcp[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/opts Nov 26 05:04:17 localhost podman[323560]: 2025-11-26 10:04:17.662909129 +0000 UTC m=+0.065772658 container kill edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:17.859 262471 INFO neutron.agent.dhcp.agent [None req-bfa349c0-55f7-4016-8506-087ead42f744 - 
- - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:04:15Z, description=, device_id=50b9da03-77ab-4b5e-8c44-e89155fd90ed, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c00e43a3-b8ae-4fa9-92fc-26533a047feb, ip_allocation=immediate, mac_address=fa:16:3e:9e:65:94, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:04:13Z, description=, dns_domain=, id=96766190-d4c9-4230-a751-90a95a31e14a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1147142383, port_security_enabled=True, project_id=a28ce44d2e9a40519a8955587c056dae, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52125, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2461, status=ACTIVE, subnets=['90dd1661-b439-4cc8-82f3-6ee6215a5ea6'], tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:04:14Z, vlan_transparent=None, network_id=96766190-d4c9-4230-a751-90a95a31e14a, port_security_enabled=False, project_id=a28ce44d2e9a40519a8955587c056dae, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2473, status=DOWN, tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:04:15Z on network 96766190-d4c9-4230-a751-90a95a31e14a#033[00m Nov 26 05:04:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:17.951 262471 INFO neutron.agent.dhcp.agent [None req-d8dc8d68-0c7e-4c91-94a3-42317e791a63 - - - - - -] DHCP configuration for ports {'c00e43a3-b8ae-4fa9-92fc-26533a047feb'} is completed#033[00m Nov 26 05:04:18 localhost ovn_metadata_agent[159481]: 
2025-11-26 10:04:18.074 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:04:18 localhost dnsmasq[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/addn_hosts - 1 addresses Nov 26 05:04:18 localhost dnsmasq-dhcp[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/host Nov 26 05:04:18 localhost dnsmasq-dhcp[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/opts Nov 26 05:04:18 localhost podman[323598]: 2025-11-26 10:04:18.088985491 +0000 UTC m=+0.072676053 container kill edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 26 05:04:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:18.389 262471 INFO neutron.agent.dhcp.agent [None req-ee283261-3ef9-4e07-873b-73017b524545 - - - - - -] DHCP configuration for ports {'c00e43a3-b8ae-4fa9-92fc-26533a047feb'} is completed#033[00m Nov 26 05:04:18 localhost nova_compute[281415]: 2025-11-26 10:04:18.450 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:04:18 localhost systemd[1]: tmp-crun.FIcF9n.mount: Deactivated successfully. Nov 26 05:04:18 localhost podman[323619]: 2025-11-26 10:04:18.842956301 +0000 UTC m=+0.096632564 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 05:04:18 localhost podman[323619]: 2025-11-26 10:04:18.874142508 +0000 UTC m=+0.127818781 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:04:18 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:04:18 localhost podman[323620]: 2025-11-26 10:04:18.892693312 +0000 UTC m=+0.144164147 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible) Nov 26 05:04:18 localhost podman[323620]: 2025-11-26 10:04:18.911776394 +0000 UTC m=+0.163247179 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, 
config_id=multipathd, io.buildah.version=1.41.3) Nov 26 05:04:18 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:04:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:19.026 262471 INFO neutron.agent.linux.ip_lib [None req-c8f2a975-9bbf-4cfe-8ae6-94089c28be04 - - - - - -] Device tape3f2e9e0-b4 cannot be used as it has no MAC address#033[00m Nov 26 05:04:19 localhost nova_compute[281415]: 2025-11-26 10:04:19.054 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:19 localhost kernel: device tape3f2e9e0-b4 entered promiscuous mode Nov 26 05:04:19 localhost NetworkManager[5970]: [1764151459.0626] manager: (tape3f2e9e0-b4): new Generic device (/org/freedesktop/NetworkManager/Devices/67) Nov 26 05:04:19 localhost nova_compute[281415]: 2025-11-26 10:04:19.063 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:19 localhost ovn_controller[153664]: 2025-11-26T10:04:19Z|00412|binding|INFO|Claiming lport e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd for this chassis. Nov 26 05:04:19 localhost ovn_controller[153664]: 2025-11-26T10:04:19Z|00413|binding|INFO|e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd: Claiming unknown Nov 26 05:04:19 localhost systemd-udevd[323454]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:04:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:19.074 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-42c68140-a3bd-4051-b8d1-10ff94db9a6f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42c68140-a3bd-4051-b8d1-10ff94db9a6f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a28ce44d2e9a40519a8955587c056dae', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c251bbe-22c2-4726-9783-a8255fe9926e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:19.076 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd in datapath 42c68140-a3bd-4051-b8d1-10ff94db9a6f bound to our chassis#033[00m Nov 26 05:04:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:19.078 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42c68140-a3bd-4051-b8d1-10ff94db9a6f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:19.079 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[90eb6c20-3c50-4f92-b5bb-cde796c2f367]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:19 localhost ovn_controller[153664]: 2025-11-26T10:04:19Z|00414|binding|INFO|Setting lport e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd ovn-installed in OVS Nov 26 05:04:19 localhost ovn_controller[153664]: 2025-11-26T10:04:19Z|00415|binding|INFO|Setting lport e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd up in Southbound Nov 26 05:04:19 localhost nova_compute[281415]: 2025-11-26 10:04:19.102 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:19 localhost nova_compute[281415]: 2025-11-26 10:04:19.146 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:19 localhost nova_compute[281415]: 2025-11-26 10:04:19.182 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e170 e170: 6 total, 6 up, 6 in Nov 26 05:04:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:04:20 localhost podman[323720]: Nov 26 05:04:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:04:20 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3829445094' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:04:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:04:20 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3829445094' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:04:20 localhost podman[323720]: 2025-11-26 10:04:20.120751413 +0000 UTC m=+0.104329165 container create 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:04:20 localhost systemd[1]: Started libpod-conmon-73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d.scope. Nov 26 05:04:20 localhost podman[323720]: 2025-11-26 10:04:20.068643588 +0000 UTC m=+0.052221390 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:04:20 localhost systemd[1]: Started libcrun container. 
Nov 26 05:04:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12d21a29cfb97528b810347c91976b102e78407d7f474ec8e5f44f8f45c76fbd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:04:20 localhost podman[323720]: 2025-11-26 10:04:20.204669003 +0000 UTC m=+0.188246755 container init 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 26 05:04:20 localhost podman[323720]: 2025-11-26 10:04:20.214479376 +0000 UTC m=+0.198057138 container start 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:20 localhost dnsmasq[323738]: started, version 2.85 cachesize 150 Nov 26 05:04:20 localhost dnsmasq[323738]: DNS service limited to local subnets Nov 26 05:04:20 localhost dnsmasq[323738]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:04:20 localhost dnsmasq[323738]: warning: no upstream servers 
configured Nov 26 05:04:20 localhost dnsmasq-dhcp[323738]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Nov 26 05:04:20 localhost dnsmasq[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/addn_hosts - 0 addresses Nov 26 05:04:20 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/host Nov 26 05:04:20 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/opts Nov 26 05:04:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:20.288 262471 INFO neutron.agent.dhcp.agent [None req-c8f2a975-9bbf-4cfe-8ae6-94089c28be04 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:04:18Z, description=, device_id=50b9da03-77ab-4b5e-8c44-e89155fd90ed, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=76c24320-f409-41f0-b1b8-e2fd645a4936, ip_allocation=immediate, mac_address=fa:16:3e:c9:de:5f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:04:17Z, description=, dns_domain=, id=42c68140-a3bd-4051-b8d1-10ff94db9a6f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-503599426, port_security_enabled=True, project_id=a28ce44d2e9a40519a8955587c056dae, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6404, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2477, status=ACTIVE, subnets=['bb51c8c2-d476-4009-aebb-0b90eff01099'], tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:04:18Z, vlan_transparent=None, network_id=42c68140-a3bd-4051-b8d1-10ff94db9a6f, port_security_enabled=False, 
project_id=a28ce44d2e9a40519a8955587c056dae, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2483, status=DOWN, tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:04:18Z on network 42c68140-a3bd-4051-b8d1-10ff94db9a6f#033[00m Nov 26 05:04:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:20.355 262471 INFO neutron.agent.dhcp.agent [None req-65b964a0-b3bd-4b86-9f51-7deac10f87e9 - - - - - -] DHCP configuration for ports {'79d6f9e9-b37b-42d0-90e3-7f3d90b695e9'} is completed#033[00m Nov 26 05:04:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e171 e171: 6 total, 6 up, 6 in Nov 26 05:04:20 localhost dnsmasq[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/addn_hosts - 1 addresses Nov 26 05:04:20 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/host Nov 26 05:04:20 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/opts Nov 26 05:04:20 localhost podman[323758]: 2025-11-26 10:04:20.520064184 +0000 UTC m=+0.068546014 container kill 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:04:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:20.753 262471 INFO neutron.agent.dhcp.agent [None req-c8f2a975-9bbf-4cfe-8ae6-94089c28be04 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], 
binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:04:18Z, description=, device_id=50b9da03-77ab-4b5e-8c44-e89155fd90ed, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=76c24320-f409-41f0-b1b8-e2fd645a4936, ip_allocation=immediate, mac_address=fa:16:3e:c9:de:5f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:04:17Z, description=, dns_domain=, id=42c68140-a3bd-4051-b8d1-10ff94db9a6f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-503599426, port_security_enabled=True, project_id=a28ce44d2e9a40519a8955587c056dae, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6404, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2477, status=ACTIVE, subnets=['bb51c8c2-d476-4009-aebb-0b90eff01099'], tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:04:18Z, vlan_transparent=None, network_id=42c68140-a3bd-4051-b8d1-10ff94db9a6f, port_security_enabled=False, project_id=a28ce44d2e9a40519a8955587c056dae, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2483, status=DOWN, tags=[], tenant_id=a28ce44d2e9a40519a8955587c056dae, updated_at=2025-11-26T10:04:18Z on network 42c68140-a3bd-4051-b8d1-10ff94db9a6f#033[00m Nov 26 05:04:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:20.828 262471 INFO neutron.agent.dhcp.agent [None req-334f67c4-9251-44a0-8beb-122465fff69b - - - - - -] DHCP configuration for ports {'76c24320-f409-41f0-b1b8-e2fd645a4936'} is completed#033[00m Nov 26 05:04:20 localhost dnsmasq[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/addn_hosts - 1 addresses Nov 26 05:04:20 localhost 
dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/host Nov 26 05:04:20 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/opts Nov 26 05:04:20 localhost podman[323798]: 2025-11-26 10:04:20.976673912 +0000 UTC m=+0.072218029 container kill 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:04:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:21.208 262471 INFO neutron.agent.linux.ip_lib [None req-0dd2a4cc-0cb3-4a50-96ef-db86bde9ae61 - - - - - -] Device tap9db834ed-36 cannot be used as it has no MAC address#033[00m Nov 26 05:04:21 localhost nova_compute[281415]: 2025-11-26 10:04:21.242 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:21 localhost kernel: device tap9db834ed-36 entered promiscuous mode Nov 26 05:04:21 localhost NetworkManager[5970]: [1764151461.2528] manager: (tap9db834ed-36): new Generic device (/org/freedesktop/NetworkManager/Devices/68) Nov 26 05:04:21 localhost nova_compute[281415]: 2025-11-26 10:04:21.253 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:21 localhost ovn_controller[153664]: 2025-11-26T10:04:21Z|00416|binding|INFO|Claiming lport 9db834ed-3699-4554-919f-97a6fac3e915 for this chassis. 
Nov 26 05:04:21 localhost ovn_controller[153664]: 2025-11-26T10:04:21Z|00417|binding|INFO|9db834ed-3699-4554-919f-97a6fac3e915: Claiming unknown Nov 26 05:04:21 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:21.265 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-ddff5853-e5d2-41f7-abc5-5f9298c6d546', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ddff5853-e5d2-41f7-abc5-5f9298c6d546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a556ffda51124a0fb5ad54c9ab27653e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b2ccf9f-c89e-47a5-bc47-16dd22deda97, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9db834ed-3699-4554-919f-97a6fac3e915) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:21 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:21.267 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 9db834ed-3699-4554-919f-97a6fac3e915 in datapath ddff5853-e5d2-41f7-abc5-5f9298c6d546 bound to our chassis#033[00m Nov 26 05:04:21 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:21.268 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
ddff5853-e5d2-41f7-abc5-5f9298c6d546 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:21 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:21.270 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4dcb53-e2e7-465a-a503-5af9348d9797]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:21.281 262471 INFO neutron.agent.dhcp.agent [None req-528684cc-748a-4c01-b1a7-c1b2c2f4dfb4 - - - - - -] DHCP configuration for ports {'76c24320-f409-41f0-b1b8-e2fd645a4936'} is completed#033[00m Nov 26 05:04:21 localhost journal[229445]: ethtool ioctl error on tap9db834ed-36: No such device Nov 26 05:04:21 localhost ovn_controller[153664]: 2025-11-26T10:04:21Z|00418|binding|INFO|Setting lport 9db834ed-3699-4554-919f-97a6fac3e915 ovn-installed in OVS Nov 26 05:04:21 localhost ovn_controller[153664]: 2025-11-26T10:04:21Z|00419|binding|INFO|Setting lport 9db834ed-3699-4554-919f-97a6fac3e915 up in Southbound Nov 26 05:04:21 localhost nova_compute[281415]: 2025-11-26 10:04:21.296 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:21 localhost journal[229445]: ethtool ioctl error on tap9db834ed-36: No such device Nov 26 05:04:21 localhost journal[229445]: ethtool ioctl error on tap9db834ed-36: No such device Nov 26 05:04:21 localhost journal[229445]: ethtool ioctl error on tap9db834ed-36: No such device Nov 26 05:04:21 localhost journal[229445]: ethtool ioctl error on tap9db834ed-36: No such device Nov 26 05:04:21 localhost journal[229445]: ethtool ioctl error on tap9db834ed-36: No such device Nov 26 05:04:21 localhost journal[229445]: ethtool ioctl error on tap9db834ed-36: No such device Nov 26 05:04:21 localhost journal[229445]: 
ethtool ioctl error on tap9db834ed-36: No such device Nov 26 05:04:21 localhost nova_compute[281415]: 2025-11-26 10:04:21.359 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:21 localhost nova_compute[281415]: 2025-11-26 10:04:21.397 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:21 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e172 e172: 6 total, 6 up, 6 in Nov 26 05:04:22 localhost dnsmasq[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/addn_hosts - 0 addresses Nov 26 05:04:22 localhost dnsmasq-dhcp[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/host Nov 26 05:04:22 localhost dnsmasq-dhcp[322569]: read /var/lib/neutron/dhcp/f0549b39-ed4b-4ffc-bb4b-a02397604463/opts Nov 26 05:04:22 localhost podman[323891]: 2025-11-26 10:04:22.008610124 +0000 UTC m=+0.068824083 container kill 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:04:22 localhost kernel: device tap0997460b-5e left promiscuous mode Nov 26 05:04:22 localhost ovn_controller[153664]: 2025-11-26T10:04:22Z|00420|binding|INFO|Releasing lport 0997460b-5e19-4748-b46b-31180af42203 from this chassis (sb_readonly=0) Nov 26 05:04:22 localhost ovn_controller[153664]: 2025-11-26T10:04:22Z|00421|binding|INFO|Setting lport 
0997460b-5e19-4748-b46b-31180af42203 down in Southbound Nov 26 05:04:22 localhost nova_compute[281415]: 2025-11-26 10:04:22.243 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:22 localhost nova_compute[281415]: 2025-11-26 10:04:22.245 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 26 05:04:22 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:22.252 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-f0549b39-ed4b-4ffc-bb4b-a02397604463', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0549b39-ed4b-4ffc-bb4b-a02397604463', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f26aa24f87924ee1873628bdfc9d6d35', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fca7c47-efde-413f-b7cc-9f4bd34ee3e1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0997460b-5e19-4748-b46b-31180af42203) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:22 localhost 
ovn_metadata_agent[159481]: 2025-11-26 10:04:22.254 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0997460b-5e19-4748-b46b-31180af42203 in datapath f0549b39-ed4b-4ffc-bb4b-a02397604463 unbound from our chassis#033[00m Nov 26 05:04:22 localhost nova_compute[281415]: 2025-11-26 10:04:22.259 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:22 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:22.260 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0549b39-ed4b-4ffc-bb4b-a02397604463, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:04:22 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:22.262 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f812b1ef-ca9e-43c5-a609-23e6a945bad5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:22 localhost podman[323937]: Nov 26 05:04:22 localhost podman[323937]: 2025-11-26 10:04:22.455323216 +0000 UTC m=+0.099682120 container create 8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddff5853-e5d2-41f7-abc5-5f9298c6d546, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 26 05:04:22 localhost systemd[1]: Started libpod-conmon-8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2.scope. 
Nov 26 05:04:22 localhost podman[323937]: 2025-11-26 10:04:22.409506836 +0000 UTC m=+0.053865780 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:04:22 localhost systemd[1]: Started libcrun container. Nov 26 05:04:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/735a90309c6962b7d94c054ee80083ad674968fc28db5e75e5a88c4ad650730f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:04:22 localhost podman[323937]: 2025-11-26 10:04:22.53521 +0000 UTC m=+0.179568904 container init 8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddff5853-e5d2-41f7-abc5-5f9298c6d546, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:04:22 localhost podman[323937]: 2025-11-26 10:04:22.546617585 +0000 UTC m=+0.190976509 container start 8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddff5853-e5d2-41f7-abc5-5f9298c6d546, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:04:22 localhost dnsmasq[323957]: started, version 2.85 cachesize 150 Nov 26 05:04:22 localhost dnsmasq[323957]: DNS service limited to local subnets Nov 26 05:04:22 localhost 
dnsmasq[323957]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:04:22 localhost dnsmasq[323957]: warning: no upstream servers configured Nov 26 05:04:22 localhost dnsmasq-dhcp[323957]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:04:22 localhost dnsmasq[323957]: read /var/lib/neutron/dhcp/ddff5853-e5d2-41f7-abc5-5f9298c6d546/addn_hosts - 0 addresses Nov 26 05:04:22 localhost dnsmasq-dhcp[323957]: read /var/lib/neutron/dhcp/ddff5853-e5d2-41f7-abc5-5f9298c6d546/host Nov 26 05:04:22 localhost dnsmasq-dhcp[323957]: read /var/lib/neutron/dhcp/ddff5853-e5d2-41f7-abc5-5f9298c6d546/opts Nov 26 05:04:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e173 e173: 6 total, 6 up, 6 in Nov 26 05:04:22 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:22.733 262471 INFO neutron.agent.dhcp.agent [None req-ec3819fe-8975-407d-afa7-f0d9329559d2 - - - - - -] DHCP configuration for ports {'19d451df-c687-43b8-8d48-53a5b6d077c6'} is completed#033[00m Nov 26 05:04:22 localhost dnsmasq[323957]: exiting on receipt of SIGTERM Nov 26 05:04:22 localhost podman[323975]: 2025-11-26 10:04:22.948883178 +0000 UTC m=+0.071647420 container kill 8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddff5853-e5d2-41f7-abc5-5f9298c6d546, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:04:22 localhost systemd[1]: libpod-8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2.scope: Deactivated successfully. 
Nov 26 05:04:23 localhost podman[323989]: 2025-11-26 10:04:23.02835062 +0000 UTC m=+0.059843275 container died 8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddff5853-e5d2-41f7-abc5-5f9298c6d546, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 26 05:04:23 localhost ovn_controller[153664]: 2025-11-26T10:04:23Z|00422|binding|INFO|Removing iface tap9db834ed-36 ovn-installed in OVS Nov 26 05:04:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:23.093 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 9cec270b-c004-4f25-a5c4-76c124c5121a with type ""#033[00m Nov 26 05:04:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:23.096 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-ddff5853-e5d2-41f7-abc5-5f9298c6d546', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ddff5853-e5d2-41f7-abc5-5f9298c6d546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a556ffda51124a0fb5ad54c9ab27653e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b2ccf9f-c89e-47a5-bc47-16dd22deda97, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9db834ed-3699-4554-919f-97a6fac3e915) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:23.098 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 9db834ed-3699-4554-919f-97a6fac3e915 in datapath ddff5853-e5d2-41f7-abc5-5f9298c6d546 unbound from our chassis#033[00m Nov 26 05:04:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:23.099 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ddff5853-e5d2-41f7-abc5-5f9298c6d546 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:23 localhost ovn_controller[153664]: 2025-11-26T10:04:23Z|00423|binding|INFO|Removing lport 9db834ed-3699-4554-919f-97a6fac3e915 ovn-installed in OVS Nov 26 05:04:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:23.100 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[669d5366-a025-4e6b-ae09-26c8e9176418]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:23 localhost nova_compute[281415]: 2025-11-26 10:04:23.102 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:23 localhost podman[323989]: 2025-11-26 10:04:23.114264162 +0000 UTC m=+0.145756767 container cleanup 8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-ddff5853-e5d2-41f7-abc5-5f9298c6d546, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:23 localhost systemd[1]: libpod-conmon-8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2.scope: Deactivated successfully. Nov 26 05:04:23 localhost podman[323990]: 2025-11-26 10:04:23.14677373 +0000 UTC m=+0.170924437 container remove 8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddff5853-e5d2-41f7-abc5-5f9298c6d546, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:04:23 localhost kernel: device tap9db834ed-36 left promiscuous mode Nov 26 05:04:23 localhost nova_compute[281415]: 2025-11-26 10:04:23.161 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:23 localhost nova_compute[281415]: 2025-11-26 10:04:23.179 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:23 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:23.337 262471 INFO neutron.agent.dhcp.agent [None req-7ce64f6a-e40c-4b83-b7bc-d377890062c8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:23 localhost 
neutron_dhcp_agent[262467]: 2025-11-26 10:04:23.338 262471 INFO neutron.agent.dhcp.agent [None req-7ce64f6a-e40c-4b83-b7bc-d377890062c8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:23 localhost systemd[1]: var-lib-containers-storage-overlay-735a90309c6962b7d94c054ee80083ad674968fc28db5e75e5a88c4ad650730f-merged.mount: Deactivated successfully. Nov 26 05:04:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b331bd07590a60ea92231d152e15399a51ca97adf177e8ebbe8211da0d8fbc2-userdata-shm.mount: Deactivated successfully. Nov 26 05:04:23 localhost systemd[1]: run-netns-qdhcp\x2dddff5853\x2de5d2\x2d41f7\x2dabc5\x2d5f9298c6d546.mount: Deactivated successfully. Nov 26 05:04:23 localhost nova_compute[281415]: 2025-11-26 10:04:23.489 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e174 e174: 6 total, 6 up, 6 in Nov 26 05:04:23 localhost ovn_controller[153664]: 2025-11-26T10:04:23Z|00424|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:23 localhost nova_compute[281415]: 2025-11-26 10:04:23.860 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 26 05:04:25 localhost ovn_controller[153664]: 2025-11-26T10:04:25Z|00425|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:25 localhost nova_compute[281415]: 2025-11-26 10:04:25.664 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e175 e175: 6 total, 6 up, 6 in Nov 26 05:04:25 localhost nova_compute[281415]: 2025-11-26 10:04:25.912 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:26 localhost dnsmasq[322569]: exiting on receipt of SIGTERM Nov 26 05:04:26 localhost podman[324034]: 2025-11-26 10:04:26.621119878 +0000 UTC m=+0.063549600 container kill 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 26 05:04:26 localhost systemd[1]: libpod-416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653.scope: Deactivated successfully. 
Nov 26 05:04:26 localhost podman[324046]: 2025-11-26 10:04:26.690926641 +0000 UTC m=+0.058790933 container died 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:04:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e176 e176: 6 total, 6 up, 6 in Nov 26 05:04:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653-userdata-shm.mount: Deactivated successfully. Nov 26 05:04:26 localhost podman[324046]: 2025-11-26 10:04:26.727554496 +0000 UTC m=+0.095418778 container cleanup 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:04:26 localhost systemd[1]: libpod-conmon-416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653.scope: Deactivated successfully. 
Nov 26 05:04:26 localhost podman[324054]: 2025-11-26 10:04:26.758383161 +0000 UTC m=+0.109983208 container remove 416cf1559770b1ec07fcd4d574d51ef3e2a7d59ef7a24485f3b67d7b79665653 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0549b39-ed4b-4ffc-bb4b-a02397604463, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:04:26 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:26.805 262471 INFO neutron.agent.dhcp.agent [None req-7ff033df-af40-4bbc-878b-e386ad6b6908 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:26 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:26.920 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:27 localhost nova_compute[281415]: 2025-11-26 10:04:27.291 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:27 localhost podman[240049]: time="2025-11-26T10:04:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:04:27 localhost podman[240049]: @ - - [26/Nov/2025:10:04:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 161152 "" "Go-http-client/1.1" Nov 26 05:04:27 localhost podman[240049]: @ - - [26/Nov/2025:10:04:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20686 "" "Go-http-client/1.1" Nov 26 05:04:27 localhost systemd[1]: 
var-lib-containers-storage-overlay-e79c0d28f34c5cebdd492b41edb96bd4e577be59e9da077accbd62b367103fa1-merged.mount: Deactivated successfully. Nov 26 05:04:27 localhost systemd[1]: run-netns-qdhcp\x2df0549b39\x2ded4b\x2d4ffc\x2dbb4b\x2da02397604463.mount: Deactivated successfully. Nov 26 05:04:28 localhost nova_compute[281415]: 2025-11-26 10:04:28.527 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e177 e177: 6 total, 6 up, 6 in Nov 26 05:04:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:29.297 262471 INFO neutron.agent.linux.ip_lib [None req-ecd5abe3-ad02-408e-887a-88815a2722c8 - - - - - -] Device tap382b8943-46 cannot be used as it has no MAC address#033[00m Nov 26 05:04:29 localhost nova_compute[281415]: 2025-11-26 10:04:29.329 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:29 localhost kernel: device tap382b8943-46 entered promiscuous mode Nov 26 05:04:29 localhost NetworkManager[5970]: [1764151469.3367] manager: (tap382b8943-46): new Generic device (/org/freedesktop/NetworkManager/Devices/69) Nov 26 05:04:29 localhost nova_compute[281415]: 2025-11-26 10:04:29.338 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:29 localhost ovn_controller[153664]: 2025-11-26T10:04:29Z|00426|binding|INFO|Claiming lport 382b8943-46cb-40f3-9f3a-e8a7613a95c4 for this chassis. Nov 26 05:04:29 localhost ovn_controller[153664]: 2025-11-26T10:04:29Z|00427|binding|INFO|382b8943-46cb-40f3-9f3a-e8a7613a95c4: Claiming unknown Nov 26 05:04:29 localhost systemd-udevd[324087]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:04:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:29.362 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-c79f52c6-eded-4129-96ad-66b78eea79d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c79f52c6-eded-4129-96ad-66b78eea79d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a556ffda51124a0fb5ad54c9ab27653e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbd35d0a-00ef-4d37-b224-47586a5e5228, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=382b8943-46cb-40f3-9f3a-e8a7613a95c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:29.364 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 382b8943-46cb-40f3-9f3a-e8a7613a95c4 in datapath c79f52c6-eded-4129-96ad-66b78eea79d4 bound to our chassis#033[00m Nov 26 05:04:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:29.367 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c79f52c6-eded-4129-96ad-66b78eea79d4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:29 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:29.368 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[610f0613-6bb6-4c1b-a4e6-a5cf5142a475]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:29 localhost ovn_controller[153664]: 2025-11-26T10:04:29Z|00428|binding|INFO|Setting lport 382b8943-46cb-40f3-9f3a-e8a7613a95c4 ovn-installed in OVS Nov 26 05:04:29 localhost ovn_controller[153664]: 2025-11-26T10:04:29Z|00429|binding|INFO|Setting lport 382b8943-46cb-40f3-9f3a-e8a7613a95c4 up in Southbound Nov 26 05:04:29 localhost nova_compute[281415]: 2025-11-26 10:04:29.385 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:29 localhost nova_compute[281415]: 2025-11-26 10:04:29.424 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:29 localhost nova_compute[281415]: 2025-11-26 10:04:29.461 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e178 e178: 6 total, 6 up, 6 in Nov 26 05:04:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:04:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:04:29 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/4238861329' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:04:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:04:29 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4238861329' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:04:30 localhost ovn_controller[153664]: 2025-11-26T10:04:30Z|00430|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:30 localhost nova_compute[281415]: 2025-11-26 10:04:30.350 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:30 localhost podman[324140]: Nov 26 05:04:30 localhost podman[324140]: 2025-11-26 10:04:30.446762671 +0000 UTC m=+0.095361176 container create 7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c79f52c6-eded-4129-96ad-66b78eea79d4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:30 localhost systemd[1]: Started libpod-conmon-7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e.scope. Nov 26 05:04:30 localhost podman[324140]: 2025-11-26 10:04:30.40026123 +0000 UTC m=+0.048859765 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:04:30 localhost systemd[1]: Started libcrun container. 
Nov 26 05:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9351fbed5eb1264a4723e02476024fceb48fb6c4cb726a666de7be02cd28b7ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:04:30 localhost podman[324140]: 2025-11-26 10:04:30.524675954 +0000 UTC m=+0.173274469 container init 7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c79f52c6-eded-4129-96ad-66b78eea79d4, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 26 05:04:30 localhost podman[324140]: 2025-11-26 10:04:30.537066218 +0000 UTC m=+0.185664723 container start 7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c79f52c6-eded-4129-96ad-66b78eea79d4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:04:30 localhost dnsmasq[324159]: started, version 2.85 cachesize 150 Nov 26 05:04:30 localhost dnsmasq[324159]: DNS service limited to local subnets Nov 26 05:04:30 localhost dnsmasq[324159]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:04:30 localhost dnsmasq[324159]: warning: no upstream servers 
configured Nov 26 05:04:30 localhost dnsmasq-dhcp[324159]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 26 05:04:30 localhost dnsmasq[324159]: read /var/lib/neutron/dhcp/c79f52c6-eded-4129-96ad-66b78eea79d4/addn_hosts - 0 addresses Nov 26 05:04:30 localhost dnsmasq-dhcp[324159]: read /var/lib/neutron/dhcp/c79f52c6-eded-4129-96ad-66b78eea79d4/host Nov 26 05:04:30 localhost dnsmasq-dhcp[324159]: read /var/lib/neutron/dhcp/c79f52c6-eded-4129-96ad-66b78eea79d4/opts Nov 26 05:04:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:30.763 262471 INFO neutron.agent.dhcp.agent [None req-eac15c27-d0a7-437a-9ae7-2e30a58afba9 - - - - - -] DHCP configuration for ports {'10f4b056-1e1e-4c48-96fd-de9455c00f98'} is completed#033[00m Nov 26 05:04:30 localhost dnsmasq[324159]: read /var/lib/neutron/dhcp/c79f52c6-eded-4129-96ad-66b78eea79d4/addn_hosts - 0 addresses Nov 26 05:04:30 localhost dnsmasq-dhcp[324159]: read /var/lib/neutron/dhcp/c79f52c6-eded-4129-96ad-66b78eea79d4/host Nov 26 05:04:30 localhost podman[324177]: 2025-11-26 10:04:30.966558155 +0000 UTC m=+0.065049546 container kill 7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c79f52c6-eded-4129-96ad-66b78eea79d4, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:04:30 localhost dnsmasq-dhcp[324159]: read /var/lib/neutron/dhcp/c79f52c6-eded-4129-96ad-66b78eea79d4/opts Nov 26 05:04:31 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:04:31 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : 
from='client.? 172.18.0.32:0/2025524811' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:04:31 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:04:31 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2025524811' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:04:31 localhost systemd[1]: tmp-crun.Y4ALoV.mount: Deactivated successfully. Nov 26 05:04:31 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:31.483 262471 INFO neutron.agent.dhcp.agent [None req-113e8b79-81ed-473f-94a4-e5b666be1fe1 - - - - - -] DHCP configuration for ports {'10f4b056-1e1e-4c48-96fd-de9455c00f98', '382b8943-46cb-40f3-9f3a-e8a7613a95c4'} is completed#033[00m Nov 26 05:04:31 localhost sshd[324203]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:04:31 localhost ovn_controller[153664]: 2025-11-26T10:04:31Z|00431|binding|INFO|Removing iface tap382b8943-46 ovn-installed in OVS Nov 26 05:04:31 localhost ovn_controller[153664]: 2025-11-26T10:04:31Z|00432|binding|INFO|Removing lport 382b8943-46cb-40f3-9f3a-e8a7613a95c4 ovn-installed in OVS Nov 26 05:04:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:31.774 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port a4ddd6e6-d57c-431c-8270-27fca3703837 with type ""#033[00m Nov 26 05:04:31 localhost nova_compute[281415]: 2025-11-26 10:04:31.775 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:31.775 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-c79f52c6-eded-4129-96ad-66b78eea79d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c79f52c6-eded-4129-96ad-66b78eea79d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a556ffda51124a0fb5ad54c9ab27653e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbd35d0a-00ef-4d37-b224-47586a5e5228, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=382b8943-46cb-40f3-9f3a-e8a7613a95c4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:31.778 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 382b8943-46cb-40f3-9f3a-e8a7613a95c4 in datapath c79f52c6-eded-4129-96ad-66b78eea79d4 unbound from our chassis#033[00m Nov 26 05:04:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:31.781 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c79f52c6-eded-4129-96ad-66b78eea79d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:04:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:31.782 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[97a8ec06-9232-4208-9149-31812dd90e5a]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:31 localhost nova_compute[281415]: 2025-11-26 10:04:31.783 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:31 localhost dnsmasq[324159]: exiting on receipt of SIGTERM Nov 26 05:04:31 localhost podman[324217]: 2025-11-26 10:04:31.89300357 +0000 UTC m=+0.076701388 container kill 7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c79f52c6-eded-4129-96ad-66b78eea79d4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:04:31 localhost systemd[1]: libpod-7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e.scope: Deactivated successfully. 
Nov 26 05:04:31 localhost podman[324232]: 2025-11-26 10:04:31.97431238 +0000 UTC m=+0.061961291 container died 7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c79f52c6-eded-4129-96ad-66b78eea79d4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:04:32 localhost podman[324232]: 2025-11-26 10:04:32.01244293 +0000 UTC m=+0.100091801 container cleanup 7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c79f52c6-eded-4129-96ad-66b78eea79d4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 05:04:32 localhost systemd[1]: libpod-conmon-7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e.scope: Deactivated successfully. 
Nov 26 05:04:32 localhost podman[324234]: 2025-11-26 10:04:32.060260663 +0000 UTC m=+0.137940296 container remove 7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c79f52c6-eded-4129-96ad-66b78eea79d4, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 05:04:32 localhost kernel: device tap382b8943-46 left promiscuous mode Nov 26 05:04:32 localhost nova_compute[281415]: 2025-11-26 10:04:32.110 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:32 localhost nova_compute[281415]: 2025-11-26 10:04:32.127 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.151 262471 INFO neutron.agent.dhcp.agent [None req-1d75ae1e-e240-4995-88ba-7660190e193a - - - - - -] Synchronizing state#033[00m Nov 26 05:04:32 localhost nova_compute[281415]: 2025-11-26 10:04:32.293 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:32 localhost systemd[1]: var-lib-containers-storage-overlay-9351fbed5eb1264a4723e02476024fceb48fb6c4cb726a666de7be02cd28b7ea-merged.mount: Deactivated successfully. Nov 26 05:04:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c6edca11b17f653faddcdfc6b35b02440f416ebb75e4e3f039810ad6f58752e-userdata-shm.mount: Deactivated successfully. 
Nov 26 05:04:32 localhost systemd[1]: run-netns-qdhcp\x2dc79f52c6\x2deded\x2d4129\x2d96ad\x2d66b78eea79d4.mount: Deactivated successfully. Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.500 262471 INFO neutron.agent.dhcp.agent [None req-1086a9df-db9d-4c2a-a7ab-62cd34b8cb55 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.501 262471 INFO neutron.agent.dhcp.agent [-] Starting network c79f52c6-eded-4129-96ad-66b78eea79d4 dhcp configuration#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent [None req-c63bb883-7168-4940-aa69-07f5254743f8 - - - - - -] Unable to enable dhcp for c79f52c6-eded-4129-96ad-66b78eea79d4.: oslo_messaging.rpc.client.RemoteError: Remote error: SubnetInUse Unable to complete operation on subnet add6bf50-35d9-4ef1-a532-f680760070ee: This subnet is being modified by another concurrent operation. 
Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: ['Traceback (most recent call last):\n', ' File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', ' File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', ' File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch\n result = func(ctxt, **new_args)\n', ' File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 244, in inner\n return func(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n setattr(e, \'_RETRY_EXCEEDED\', True)\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n ectxt.value = e.inner_exc\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n context_reference.session.rollback()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File 
"/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n return f(*dup_args, **dup_kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron/quota/resource_registry.py", line 95, in wrapper\n ret_val = f(_self, context, *args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron/api/rpc/handlers/dhcp_rpc.py", line 292, in create_dhcp_port\n return self._port_action(plugin, context, port, \'create_port\')\n', ' File "/usr/lib/python3.9/site-packages/neutron/api/rpc/handlers/dhcp_rpc.py", line 118, in _port_action\n return p_utils.create_port(plugin, context, port)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/plugins/utils.py", line 338, in create_port\n return core_plugin.create_port(\n', ' File "/usr/lib/python3.9/site-packages/neutron/common/utils.py", line 728, in inner\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 226, in wrapped\n return f_with_retry(*args, **kwargs,\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n setattr(e, \'_RETRY_EXCEEDED\', True)\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n ectxt.value = e.inner_exc\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, 
in wrapped\n context_reference.session.rollback()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n return f(*dup_args, **dup_kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1582, in create_port\n result, mech_context = self._create_port_db(context, port)\n', ' File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1547, in _create_port_db\n port_db = self.create_port_db(context, port)\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/db_base_plugin_v2.py", line 1501, in create_port_db\n self.ipam.allocate_ips_for_port_and_store(\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/ipam_pluggable_backend.py", line 219, in allocate_ips_for_port_and_store\n ips = self.allocate_ips_for_port(context, port_copy)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 224, in wrapped\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 1044, in wrapper\n return fn(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/ipam_pluggable_backend.py", line 226, in allocate_ips_for_port\n return self._allocate_ips_for_port(context, port)\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/ipam_pluggable_backend.py", line 258, in _allocate_ips_for_port\n subnets = self._ipam_get_subnets(\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/ipam_backend_mixin.py", line 686, in _ipam_get_subnets\n subnet.read_lock_register(\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/models_v2.py", line 81, in read_lock_register\n raise exception\n', 'neutron_lib.exceptions.SubnetInUse: Unable to complete operation on subnet 
add6bf50-35d9-4ef1-a532-f680760070ee: This subnet is being modified by another concurrent operation.\n']. Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 324, in enable Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent common_utils.wait_until_true(self._enable, timeout=300) Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/common/utils.py", line 744, in wait_until_true Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent while not predicate(): Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 336, in _enable Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent interface_name = self.device_manager.setup( Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1825, in setup Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 
10:04:32.620 262471 ERROR neutron.agent.dhcp.agent self.cleanup_stale_devices(network, dhcp_port=None) Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent self.force_reraise() Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent raise self.value Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1820, in setup Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent port = self.setup_dhcp_port(network, segment) Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1755, in setup_dhcp_port Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent dhcp_port = setup_method(network, device_id, dhcp_subnets) Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1703, in _setup_new_dhcp_port Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent return self.plugin.create_dhcp_port({'port': port_dict}) Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 
10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 893, in create_dhcp_port Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent port = cctxt.call(self.context, 'create_dhcp_port', Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron_lib/rpc.py", line 157, in call Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent return self._original_context.call(ctxt, method, **kwargs) Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 190, in call Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent result = self.transport._send( Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent return self._driver.send(target, ctxt, message, Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent return self._send(target, ctxt, message, wait_for_reply, timeout, Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 681, in _send Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent raise result Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent oslo_messaging.rpc.client.RemoteError: Remote error: SubnetInUse Unable to complete operation on subnet add6bf50-35d9-4ef1-a532-f680760070ee: This subnet is being modified by another concurrent operation. Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent ['Traceback (most recent call last):\n', ' File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming\n res = self.dispatcher.dispatch(message)\n', ' File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', ' File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch\n result = func(ctxt, **new_args)\n', ' File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 244, in inner\n return func(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n setattr(e, \'_RETRY_EXCEEDED\', True)\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n ectxt.value = e.inner_exc\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n 
raise self.value\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n context_reference.session.rollback()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n return f(*dup_args, **dup_kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron/quota/resource_registry.py", line 95, in wrapper\n ret_val = f(_self, context, *args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron/api/rpc/handlers/dhcp_rpc.py", line 292, in create_dhcp_port\n return self._port_action(plugin, context, port, \'create_port\')\n', ' File "/usr/lib/python3.9/site-packages/neutron/api/rpc/handlers/dhcp_rpc.py", line 118, in _port_action\n return p_utils.create_port(plugin, context, port)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/plugins/utils.py", line 338, in create_port\n return core_plugin.create_port(\n', ' File "/usr/lib/python3.9/site-packages/neutron/common/utils.py", line 728, in inner\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 226, in wrapped\n return f_with_retry(*args, **kwargs,\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 142, in wrapped\n setattr(e, \'_RETRY_EXCEEDED\', True)\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 138, in wrapped\n return f(*args, **kwargs)\n', ' File 
"/usr/lib/python3.9/site-packages/oslo_db/api.py", line 154, in wrapper\n ectxt.value = e.inner_exc\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/api.py", line 142, in wrapper\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 190, in wrapped\n context_reference.session.rollback()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 184, in wrapped\n return f(*dup_args, **dup_kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1582, in create_port\n result, mech_context = self._create_port_db(context, port)\n', ' File "/usr/lib/python3.9/site-packages/neutron/plugins/ml2/plugin.py", line 1547, in _create_port_db\n port_db = self.create_port_db(context, port)\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/db_base_plugin_v2.py", line 1501, in create_port_db\n self.ipam.allocate_ips_for_port_and_store(\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/ipam_pluggable_backend.py", line 219, in allocate_ips_for_port_and_store\n ips = self.allocate_ips_for_port(context, port_copy)\n', ' File "/usr/lib/python3.9/site-packages/neutron_lib/db/api.py", line 224, in wrapped\n return f(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 1044, in wrapper\n return fn(*args, **kwargs)\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/ipam_pluggable_backend.py", line 226, in allocate_ips_for_port\n return self._allocate_ips_for_port(context, 
port)\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/ipam_pluggable_backend.py", line 258, in _allocate_ips_for_port\n subnets = self._ipam_get_subnets(\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/ipam_backend_mixin.py", line 686, in _ipam_get_subnets\n subnet.read_lock_register(\n', ' File "/usr/lib/python3.9/site-packages/neutron/db/models_v2.py", line 81, in read_lock_register\n raise exception\n', 'neutron_lib.exceptions.SubnetInUse: Unable to complete operation on subnet add6bf50-35d9-4ef1-a532-f680760070ee: This subnet is being modified by another concurrent operation.\n']. Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.620 262471 ERROR neutron.agent.dhcp.agent #033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.622 262471 INFO neutron.agent.dhcp.agent [None req-c63bb883-7168-4940-aa69-07f5254743f8 - - - - - -] Finished network c79f52c6-eded-4129-96ad-66b78eea79d4 dhcp configuration#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.622 262471 INFO neutron.agent.dhcp.agent [None req-1086a9df-db9d-4c2a-a7ab-62cd34b8cb55 - - - - - -] Synchronizing state complete#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.623 262471 INFO neutron.agent.dhcp.agent [None req-1086a9df-db9d-4c2a-a7ab-62cd34b8cb55 - - - - - -] Synchronizing state#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.734 262471 INFO neutron.agent.dhcp.agent [None req-6e7cb48c-f83a-40de-9a0a-da84e9465407 - - - - - -] DHCP configuration for ports {'10f4b056-1e1e-4c48-96fd-de9455c00f98'} is completed#033[00m Nov 26 05:04:32 localhost nova_compute[281415]: 2025-11-26 10:04:32.851 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.924 262471 INFO neutron.agent.dhcp.agent 
[None req-022c6658-4fbf-48be-944e-9c65e17f82bc - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.925 262471 INFO neutron.agent.dhcp.agent [-] Starting network c79f52c6-eded-4129-96ad-66b78eea79d4 dhcp configuration#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.926 262471 INFO neutron.agent.dhcp.agent [-] Finished network c79f52c6-eded-4129-96ad-66b78eea79d4 dhcp configuration#033[00m Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.926 262471 INFO neutron.agent.dhcp.agent [None req-022c6658-4fbf-48be-944e-9c65e17f82bc - - - - - -] Synchronizing state complete#033[00m Nov 26 05:04:32 localhost ovn_controller[153664]: 2025-11-26T10:04:32Z|00433|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:32.968 262471 INFO neutron.agent.dhcp.agent [None req-546f59ba-fb64-49b2-afa7-3fc13fdbebbb - - - - - -] DHCP configuration for ports {'10f4b056-1e1e-4c48-96fd-de9455c00f98'} is completed#033[00m Nov 26 05:04:33 localhost nova_compute[281415]: 2025-11-26 10:04:32.998 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:33 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e179 e179: 6 total, 6 up, 6 in Nov 26 05:04:33 localhost nova_compute[281415]: 2025-11-26 10:04:33.560 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:04:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:04:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:04:34 localhost systemd[1]: tmp-crun.2k3UFZ.mount: Deactivated successfully. Nov 26 05:04:34 localhost podman[324263]: 2025-11-26 10:04:34.839578906 +0000 UTC m=+0.092002562 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:04:34 localhost podman[324263]: 2025-11-26 10:04:34.879571945 +0000 UTC m=+0.131995561 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 
'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:04:34 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:04:34 localhost podman[324264]: 2025-11-26 10:04:34.914135085 +0000 UTC m=+0.163993641 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:34 localhost podman[324264]: 2025-11-26 10:04:34.926827379 +0000 UTC m=+0.176685985 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:34 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:04:37 localhost nova_compute[281415]: 2025-11-26 10:04:37.294 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:38 localhost ovn_controller[153664]: 2025-11-26T10:04:38Z|00434|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:38 localhost nova_compute[281415]: 2025-11-26 10:04:38.336 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:38 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e180 e180: 6 total, 6 up, 6 in Nov 26 05:04:38 localhost nova_compute[281415]: 2025-11-26 10:04:38.562 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:39 localhost podman[324318]: 2025-11-26 10:04:39.685741608 +0000 UTC m=+0.052530458 container kill 9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c351755e-976a-4c66-be24-e78c1192e045, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:04:39 localhost dnsmasq[322252]: exiting on receipt of SIGTERM Nov 26 05:04:39 localhost systemd[1]: libpod-9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c.scope: Deactivated successfully. Nov 26 05:04:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:04:39 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/13785823' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:04:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:04:39 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/13785823' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:04:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:04:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:04:39 localhost ovn_controller[153664]: 2025-11-26T10:04:39Z|00435|binding|INFO|Removing iface tape6a0b266-65 ovn-installed in OVS Nov 26 05:04:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:39.709 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8c817834-90c8-4a40-9391-72d2288bf47b with type ""#033[00m Nov 26 05:04:39 localhost ovn_controller[153664]: 2025-11-26T10:04:39Z|00436|binding|INFO|Removing lport e6a0b266-6564-46aa-9e5e-68e512c254f0 ovn-installed in OVS Nov 26 05:04:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:39.712 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-c351755e-976a-4c66-be24-e78c1192e045', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c351755e-976a-4c66-be24-e78c1192e045', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a556ffda51124a0fb5ad54c9ab27653e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b989b984-81a9-468c-92e0-4e964397d8bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e6a0b266-6564-46aa-9e5e-68e512c254f0) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 
05:04:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:39.714 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e6a0b266-6564-46aa-9e5e-68e512c254f0 in datapath c351755e-976a-4c66-be24-e78c1192e045 unbound from our chassis#033[00m Nov 26 05:04:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:39.716 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c351755e-976a-4c66-be24-e78c1192e045 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:39 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:39.717 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1618ac1f-76eb-403d-92dd-ab9bce998d74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:04:39 localhost nova_compute[281415]: 2025-11-26 10:04:39.750 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:39 localhost podman[324330]: 2025-11-26 10:04:39.771787373 +0000 UTC m=+0.072318281 container died 9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c351755e-976a-4c66-be24-e78c1192e045, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:04:39 localhost podman[324339]: 2025-11-26 
10:04:39.858757208 +0000 UTC m=+0.139725020 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6) Nov 26 05:04:39 localhost podman[324339]: 2025-11-26 10:04:39.875191608 +0000 UTC m=+0.156159440 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git) Nov 26 05:04:39 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:04:39 localhost podman[324330]: 2025-11-26 10:04:39.907811669 +0000 UTC m=+0.208342567 container cleanup 9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c351755e-976a-4c66-be24-e78c1192e045, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:04:39 localhost systemd[1]: libpod-conmon-9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c.scope: Deactivated successfully. Nov 26 05:04:39 localhost podman[324338]: 2025-11-26 10:04:39.878258263 +0000 UTC m=+0.156457059 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:04:39 localhost podman[324338]: 2025-11-26 10:04:39.959338095 +0000 UTC m=+0.237536901 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true) Nov 26 05:04:39 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:04:39 localhost podman[324337]: 2025-11-26 10:04:39.984648599 +0000 UTC m=+0.260374558 container remove 9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c351755e-976a-4c66-be24-e78c1192e045, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:04:40 localhost nova_compute[281415]: 2025-11-26 10:04:39.999 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:40 localhost kernel: device tape6a0b266-65 left promiscuous mode Nov 26 05:04:40 localhost nova_compute[281415]: 2025-11-26 10:04:40.011 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:40.029 262471 INFO neutron.agent.dhcp.agent [None req-34a0d1ea-9148-4684-ae24-05a6db34f597 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:40 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:40.030 262471 INFO neutron.agent.dhcp.agent [None req-34a0d1ea-9148-4684-ae24-05a6db34f597 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:40 localhost ovn_controller[153664]: 2025-11-26T10:04:40Z|00437|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:40 localhost nova_compute[281415]: 2025-11-26 10:04:40.224 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:40 localhost systemd[1]: var-lib-containers-storage-overlay-23971d47478032a10dccde92bfc6e0fb83fe997c423e30015b49916dcc38ff38-merged.mount: Deactivated successfully. Nov 26 05:04:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9343aac1f18b866d17a648f9ff3e2440300bd7bab4022fdc75072f96077ef76c-userdata-shm.mount: Deactivated successfully. Nov 26 05:04:40 localhost systemd[1]: run-netns-qdhcp\x2dc351755e\x2d976a\x2d4c66\x2dbe24\x2de78c1192e045.mount: Deactivated successfully. Nov 26 05:04:41 localhost podman[324423]: 2025-11-26 10:04:41.226715934 +0000 UTC m=+0.072750505 container kill 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 26 05:04:41 localhost dnsmasq[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/addn_hosts - 0 addresses Nov 26 05:04:41 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/host Nov 26 05:04:41 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/42c68140-a3bd-4051-b8d1-10ff94db9a6f/opts Nov 26 05:04:41 localhost nova_compute[281415]: 2025-11-26 10:04:41.444 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:41 localhost kernel: device tape3f2e9e0-b4 left promiscuous mode Nov 26 05:04:41 localhost ovn_controller[153664]: 2025-11-26T10:04:41Z|00438|binding|INFO|Releasing lport 
e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd from this chassis (sb_readonly=0) Nov 26 05:04:41 localhost ovn_controller[153664]: 2025-11-26T10:04:41Z|00439|binding|INFO|Setting lport e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd down in Southbound Nov 26 05:04:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:41.459 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-42c68140-a3bd-4051-b8d1-10ff94db9a6f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42c68140-a3bd-4051-b8d1-10ff94db9a6f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a28ce44d2e9a40519a8955587c056dae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c251bbe-22c2-4726-9783-a8255fe9926e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:41.461 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e3f2e9e0-b4a2-48fe-9e78-ffe06988aecd in datapath 42c68140-a3bd-4051-b8d1-10ff94db9a6f unbound from our chassis#033[00m Nov 26 05:04:41 
localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:41.462 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42c68140-a3bd-4051-b8d1-10ff94db9a6f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:41.465 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f40e86db-df11-443c-a322-d0fca3c42d95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:41 localhost nova_compute[281415]: 2025-11-26 10:04:41.467 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:41 localhost podman[324463]: 2025-11-26 10:04:41.870024905 +0000 UTC m=+0.067284745 container kill 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 05:04:41 localhost dnsmasq[323738]: exiting on receipt of SIGTERM Nov 26 05:04:41 localhost systemd[1]: libpod-73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d.scope: Deactivated successfully. 
Nov 26 05:04:41 localhost podman[324476]: 2025-11-26 10:04:41.951105818 +0000 UTC m=+0.061657002 container died 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:04:41 localhost podman[324476]: 2025-11-26 10:04:41.989239969 +0000 UTC m=+0.099791113 container cleanup 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:41 localhost systemd[1]: libpod-conmon-73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d.scope: Deactivated successfully. 
Nov 26 05:04:42 localhost podman[324478]: 2025-11-26 10:04:42.027866986 +0000 UTC m=+0.131533276 container remove 73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42c68140-a3bd-4051-b8d1-10ff94db9a6f, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 05:04:42 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:42.058 262471 INFO neutron.agent.dhcp.agent [None req-a282539e-b7e8-4a55-b45e-2641a93353b4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:42 localhost nova_compute[281415]: 2025-11-26 10:04:42.329 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:42 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:42.361 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:42 localhost ovn_controller[153664]: 2025-11-26T10:04:42Z|00440|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:42 localhost nova_compute[281415]: 2025-11-26 10:04:42.584 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:42 localhost systemd[1]: var-lib-containers-storage-overlay-12d21a29cfb97528b810347c91976b102e78407d7f474ec8e5f44f8f45c76fbd-merged.mount: Deactivated successfully. 
Nov 26 05:04:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73732e16a9e27b75de9ce4c0d21c4840bd178179e57c09cd49c1a6aab556249d-userdata-shm.mount: Deactivated successfully. Nov 26 05:04:42 localhost systemd[1]: run-netns-qdhcp\x2d42c68140\x2da3bd\x2d4051\x2db8d1\x2d10ff94db9a6f.mount: Deactivated successfully. Nov 26 05:04:43 localhost dnsmasq[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/addn_hosts - 0 addresses Nov 26 05:04:43 localhost dnsmasq-dhcp[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/host Nov 26 05:04:43 localhost podman[324522]: 2025-11-26 10:04:43.45905194 +0000 UTC m=+0.070587079 container kill edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:43 localhost dnsmasq-dhcp[323541]: read /var/lib/neutron/dhcp/96766190-d4c9-4230-a751-90a95a31e14a/opts Nov 26 05:04:43 localhost nova_compute[281415]: 2025-11-26 10:04:43.597 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:43 localhost ovn_controller[153664]: 2025-11-26T10:04:43Z|00441|binding|INFO|Releasing lport 8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d from this chassis (sb_readonly=0) Nov 26 05:04:43 localhost kernel: device tap8c5e19ce-a7 left promiscuous mode Nov 26 05:04:43 localhost nova_compute[281415]: 2025-11-26 10:04:43.669 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:43 localhost ovn_controller[153664]: 2025-11-26T10:04:43Z|00442|binding|INFO|Setting lport 8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d down in Southbound Nov 26 05:04:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:43.684 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-96766190-d4c9-4230-a751-90a95a31e14a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-96766190-d4c9-4230-a751-90a95a31e14a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a28ce44d2e9a40519a8955587c056dae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b657d111-bf25-44bc-bb74-ac3717e5ddee, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:43.686 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 8c5e19ce-a7c3-42c0-90d1-dfc5e9b2e99d in datapath 96766190-d4c9-4230-a751-90a95a31e14a unbound from our chassis#033[00m Nov 26 05:04:43 localhost 
ovn_metadata_agent[159481]: 2025-11-26 10:04:43.688 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 96766190-d4c9-4230-a751-90a95a31e14a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:43 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:43.689 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[c099d7a7-d689-4cd7-a52a-9743b14b3b4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:43 localhost nova_compute[281415]: 2025-11-26 10:04:43.691 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:43 localhost nova_compute[281415]: 2025-11-26 10:04:43.692 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:44 localhost dnsmasq[323541]: exiting on receipt of SIGTERM Nov 26 05:04:44 localhost podman[324561]: 2025-11-26 10:04:44.188331876 +0000 UTC m=+0.058942977 container kill edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:44 localhost systemd[1]: libpod-edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f.scope: Deactivated successfully. 
Nov 26 05:04:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:04:44 localhost podman[324577]: 2025-11-26 10:04:44.254065662 +0000 UTC m=+0.045457389 container died edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:04:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f-userdata-shm.mount: Deactivated successfully. Nov 26 05:04:44 localhost systemd[1]: var-lib-containers-storage-overlay-ba3e684ade5c8f781716bac137847392521841213cf36eb9bdc380b6503fb665-merged.mount: Deactivated successfully. 
Nov 26 05:04:44 localhost podman[324578]: 2025-11-26 10:04:44.324064411 +0000 UTC m=+0.108012127 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 26 05:04:44 localhost podman[324577]: 2025-11-26 10:04:44.35371082 +0000 UTC m=+0.145102547 container remove edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-96766190-d4c9-4230-a751-90a95a31e14a, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 05:04:44 localhost systemd[1]: libpod-conmon-edef080f69a556ede796dca1219bfb86a1c1e8cbceb3d0bf993012f022e13b5f.scope: Deactivated successfully. Nov 26 05:04:44 localhost podman[324578]: 2025-11-26 10:04:44.38311803 +0000 UTC m=+0.167065756 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 05:04:44 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated 
successfully. Nov 26 05:04:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:04:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:45.041 262471 INFO neutron.agent.dhcp.agent [None req-244315da-31d0-4828-8306-efe5aab99fd1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:45.042 262471 INFO neutron.agent.dhcp.agent [None req-244315da-31d0-4828-8306-efe5aab99fd1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:45.163 262471 INFO neutron.agent.linux.ip_lib [None req-ac8c3da4-1bb0-46ca-9c4a-cb3ced1e7217 - - - - - -] Device tapee9a6356-2b cannot be used as it has no MAC address#033[00m Nov 26 05:04:45 localhost systemd[1]: run-netns-qdhcp\x2d96766190\x2dd4c9\x2d4230\x2da751\x2d90a95a31e14a.mount: Deactivated successfully. Nov 26 05:04:45 localhost nova_compute[281415]: 2025-11-26 10:04:45.202 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:45 localhost kernel: device tapee9a6356-2b entered promiscuous mode Nov 26 05:04:45 localhost nova_compute[281415]: 2025-11-26 10:04:45.212 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:45 localhost ovn_controller[153664]: 2025-11-26T10:04:45Z|00443|binding|INFO|Claiming lport ee9a6356-2bbd-47d2-b872-41e118ba6f17 for this chassis. 
Nov 26 05:04:45 localhost ovn_controller[153664]: 2025-11-26T10:04:45Z|00444|binding|INFO|ee9a6356-2bbd-47d2-b872-41e118ba6f17: Claiming unknown Nov 26 05:04:45 localhost NetworkManager[5970]: [1764151485.2143] manager: (tapee9a6356-2b): new Generic device (/org/freedesktop/NetworkManager/Devices/70) Nov 26 05:04:45 localhost systemd-udevd[324634]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:04:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:45.224 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4a88868a-c729-4b53-a5da-713d88b5b238', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a88868a-c729-4b53-a5da-713d88b5b238', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ff1c2725eb448ea88b1272d72be47e9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53ae7ca7-f004-4de2-b27e-b430deae157d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ee9a6356-2bbd-47d2-b872-41e118ba6f17) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:45.226 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 
ee9a6356-2bbd-47d2-b872-41e118ba6f17 in datapath 4a88868a-c729-4b53-a5da-713d88b5b238 bound to our chassis#033[00m Nov 26 05:04:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:45.227 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4a88868a-c729-4b53-a5da-713d88b5b238 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:04:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:45.228 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[665f863c-a8a4-4002-b9e3-9d5fda4d1ed2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:45 localhost journal[229445]: ethtool ioctl error on tapee9a6356-2b: No such device Nov 26 05:04:45 localhost nova_compute[281415]: 2025-11-26 10:04:45.252 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:45 localhost ovn_controller[153664]: 2025-11-26T10:04:45Z|00445|binding|INFO|Setting lport ee9a6356-2bbd-47d2-b872-41e118ba6f17 ovn-installed in OVS Nov 26 05:04:45 localhost ovn_controller[153664]: 2025-11-26T10:04:45Z|00446|binding|INFO|Setting lport ee9a6356-2bbd-47d2-b872-41e118ba6f17 up in Southbound Nov 26 05:04:45 localhost journal[229445]: ethtool ioctl error on tapee9a6356-2b: No such device Nov 26 05:04:45 localhost journal[229445]: ethtool ioctl error on tapee9a6356-2b: No such device Nov 26 05:04:45 localhost journal[229445]: ethtool ioctl error on tapee9a6356-2b: No such device Nov 26 05:04:45 localhost journal[229445]: ethtool ioctl error on tapee9a6356-2b: No such device Nov 26 05:04:45 localhost journal[229445]: ethtool ioctl error on tapee9a6356-2b: No such device Nov 26 05:04:45 localhost journal[229445]: ethtool ioctl error on tapee9a6356-2b: No such device Nov 26 05:04:45 localhost 
journal[229445]: ethtool ioctl error on tapee9a6356-2b: No such device Nov 26 05:04:45 localhost nova_compute[281415]: 2025-11-26 10:04:45.310 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:45 localhost nova_compute[281415]: 2025-11-26 10:04:45.343 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:45.358 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:45 localhost ovn_controller[153664]: 2025-11-26T10:04:45Z|00447|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:45 localhost nova_compute[281415]: 2025-11-26 10:04:45.701 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:45 localhost openstack_network_exporter[242153]: ERROR 10:04:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:04:45 localhost openstack_network_exporter[242153]: ERROR 10:04:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:04:45 localhost openstack_network_exporter[242153]: ERROR 10:04:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:04:45 localhost openstack_network_exporter[242153]: ERROR 10:04:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:04:45 localhost openstack_network_exporter[242153]: Nov 26 05:04:45 localhost openstack_network_exporter[242153]: ERROR 10:04:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:04:45 
localhost openstack_network_exporter[242153]: Nov 26 05:04:46 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:04:46 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4212569927' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:04:46 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:04:46 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4212569927' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:04:46 localhost podman[324705]: Nov 26 05:04:46 localhost podman[324705]: 2025-11-26 10:04:46.364427809 +0000 UTC m=+0.085396067 container create aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:04:46 localhost systemd[1]: Started libpod-conmon-aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86.scope. Nov 26 05:04:46 localhost podman[324705]: 2025-11-26 10:04:46.318469025 +0000 UTC m=+0.039437323 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:04:46 localhost systemd[1]: Started libcrun container. 
Nov 26 05:04:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b8a8cf1a0ae1b45df0b3b0e6fb1e15158fe808e3344fb8bba88568a622dab59/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:04:46 localhost podman[324705]: 2025-11-26 10:04:46.448574787 +0000 UTC m=+0.169543045 container init aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:04:46 localhost podman[324705]: 2025-11-26 10:04:46.458657488 +0000 UTC m=+0.179625746 container start aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:04:46 localhost dnsmasq[324723]: started, version 2.85 cachesize 150 Nov 26 05:04:46 localhost dnsmasq[324723]: DNS service limited to local subnets Nov 26 05:04:46 localhost dnsmasq[324723]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:04:46 localhost dnsmasq[324723]: warning: no upstream servers 
configured Nov 26 05:04:46 localhost dnsmasq-dhcp[324723]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:04:46 localhost dnsmasq[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/addn_hosts - 0 addresses Nov 26 05:04:46 localhost dnsmasq-dhcp[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/host Nov 26 05:04:46 localhost dnsmasq-dhcp[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/opts Nov 26 05:04:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:46.619 262471 INFO neutron.agent.dhcp.agent [None req-07d18878-0678-42d5-a2be-787e5628cc1d - - - - - -] DHCP configuration for ports {'1c5bc906-c5a1-4f97-ae3c-9761de091647'} is completed#033[00m Nov 26 05:04:47 localhost dnsmasq[322449]: exiting on receipt of SIGTERM Nov 26 05:04:47 localhost podman[324741]: 2025-11-26 10:04:47.04932486 +0000 UTC m=+0.064608213 container kill b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d13a4499-ed1d-419a-b7ec-18731e67f9ad, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:04:47 localhost systemd[1]: libpod-b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811.scope: Deactivated successfully. 
Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.089 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.107 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Triggering sync for uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.108 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.109 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.132 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "9d78bef9-6977-4fb5-b50b-ae75124e73af" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:04:47 localhost podman[324754]: 2025-11-26 10:04:47.134758377 +0000 UTC m=+0.065339576 
container died b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d13a4499-ed1d-419a-b7ec-18731e67f9ad, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:04:47 localhost podman[324754]: 2025-11-26 10:04:47.171280879 +0000 UTC m=+0.101862038 container cleanup b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d13a4499-ed1d-419a-b7ec-18731e67f9ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 26 05:04:47 localhost systemd[1]: libpod-conmon-b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811.scope: Deactivated successfully. 
Nov 26 05:04:47 localhost podman[324756]: 2025-11-26 10:04:47.216245722 +0000 UTC m=+0.141748843 container remove b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d13a4499-ed1d-419a-b7ec-18731e67f9ad, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:47.219 159486 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 764f9365-cbeb-4d89-86e2-bd4f79d6e4ee with type ""#033[00m Nov 26 05:04:47 localhost ovn_controller[153664]: 2025-11-26T10:04:47Z|00448|binding|INFO|Removing iface tape966386f-83 ovn-installed in OVS Nov 26 05:04:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:47.221 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d13a4499-ed1d-419a-b7ec-18731e67f9ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d13a4499-ed1d-419a-b7ec-18731e67f9ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '140ee02dff30450e88d5baa79f6f7df2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3694eb6a-62d4-45ac-b83e-207240edfdd0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e966386f-8317-49af-b52a-0c8093a7a76a) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:04:47 localhost ovn_controller[153664]: 2025-11-26T10:04:47Z|00449|binding|INFO|Removing lport e966386f-8317-49af-b52a-0c8093a7a76a ovn-installed in OVS Nov 26 05:04:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:47.223 159486 INFO neutron.agent.ovn.metadata.agent [-] Port e966386f-8317-49af-b52a-0c8093a7a76a in datapath d13a4499-ed1d-419a-b7ec-18731e67f9ad unbound from our chassis#033[00m Nov 26 05:04:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:47.225 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d13a4499-ed1d-419a-b7ec-18731e67f9ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:04:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:04:47.226 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[af19cd6f-10f8-4e0c-aa5c-dbdce2c3c860]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.267 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:47 localhost kernel: device tape966386f-83 left promiscuous mode Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.282 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.330 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:47 localhost systemd[1]: var-lib-containers-storage-overlay-f1a08f726bcf402cd459f5ea4f8c45f486c7fafaceadfe24a40e70399b8a68d6-merged.mount: Deactivated successfully. Nov 26 05:04:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4a12ced12807c774e29e8a9831e7480761887ce5e69e1e96cfcd81bfd67f811-userdata-shm.mount: Deactivated successfully. Nov 26 05:04:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:47.439 262471 INFO neutron.agent.dhcp.agent [None req-15c79dbe-d16c-4b6e-8cbd-1ced2285b20a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:47 localhost systemd[1]: run-netns-qdhcp\x2dd13a4499\x2ded1d\x2d419a\x2db7ec\x2d18731e67f9ad.mount: Deactivated successfully. 
Nov 26 05:04:47 localhost nova_compute[281415]: 2025-11-26 10:04:47.751 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:47 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:47.754 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:04:48 localhost ovn_controller[153664]: 2025-11-26T10:04:48Z|00450|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:04:48 localhost nova_compute[281415]: 2025-11-26 10:04:48.059 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:48 localhost nova_compute[281415]: 2025-11-26 10:04:48.630 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:49.053 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:04:48Z, description=, device_id=179b0b8f-2844-410a-aff6-87a091c05589, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e8775dc9-c0c5-4924-acfa-5fc704d5a968, ip_allocation=immediate, mac_address=fa:16:3e:69:b1:f3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:04:42Z, description=, dns_domain=, id=4a88868a-c729-4b53-a5da-713d88b5b238, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1493293849-network, port_security_enabled=True, 
project_id=7ff1c2725eb448ea88b1272d72be47e9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=924, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2618, status=ACTIVE, subnets=['8b88eab3-2b9b-441c-a388-eb0216f2673e'], tags=[], tenant_id=7ff1c2725eb448ea88b1272d72be47e9, updated_at=2025-11-26T10:04:43Z, vlan_transparent=None, network_id=4a88868a-c729-4b53-a5da-713d88b5b238, port_security_enabled=False, project_id=7ff1c2725eb448ea88b1272d72be47e9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2647, status=DOWN, tags=[], tenant_id=7ff1c2725eb448ea88b1272d72be47e9, updated_at=2025-11-26T10:04:48Z on network 4a88868a-c729-4b53-a5da-713d88b5b238#033[00m Nov 26 05:04:49 localhost dnsmasq[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/addn_hosts - 1 addresses Nov 26 05:04:49 localhost dnsmasq-dhcp[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/host Nov 26 05:04:49 localhost podman[324801]: 2025-11-26 10:04:49.28826225 +0000 UTC m=+0.063619612 container kill aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:04:49 localhost dnsmasq-dhcp[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/opts Nov 26 05:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 05:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:04:49 localhost systemd[1]: tmp-crun.D3m52h.mount: Deactivated successfully. Nov 26 05:04:49 localhost systemd[1]: tmp-crun.gET08W.mount: Deactivated successfully. Nov 26 05:04:49 localhost podman[324816]: 2025-11-26 10:04:49.397173464 +0000 UTC m=+0.084673364 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:49 localhost podman[324816]: 2025-11-26 10:04:49.42928683 +0000 UTC m=+0.116786790 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent) Nov 26 05:04:49 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:04:49 localhost podman[324818]: 2025-11-26 10:04:49.5093594 +0000 UTC m=+0.191049380 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:04:49 localhost podman[324818]: 2025-11-26 10:04:49.543995374 +0000 UTC m=+0.225685394 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team) Nov 26 05:04:49 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:04:49 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:49.581 262471 INFO neutron.agent.dhcp.agent [None req-d7311fb5-694b-4511-978c-dd23ccf87c22 - - - - - -] DHCP configuration for ports {'e8775dc9-c0c5-4924-acfa-5fc704d5a968'} is completed#033[00m Nov 26 05:04:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:04:50 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:50.439 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:04:48Z, description=, device_id=179b0b8f-2844-410a-aff6-87a091c05589, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e8775dc9-c0c5-4924-acfa-5fc704d5a968, ip_allocation=immediate, mac_address=fa:16:3e:69:b1:f3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:04:42Z, description=, dns_domain=, id=4a88868a-c729-4b53-a5da-713d88b5b238, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1493293849-network, port_security_enabled=True, project_id=7ff1c2725eb448ea88b1272d72be47e9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=924, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2618, status=ACTIVE, subnets=['8b88eab3-2b9b-441c-a388-eb0216f2673e'], tags=[], tenant_id=7ff1c2725eb448ea88b1272d72be47e9, updated_at=2025-11-26T10:04:43Z, vlan_transparent=None, 
network_id=4a88868a-c729-4b53-a5da-713d88b5b238, port_security_enabled=False, project_id=7ff1c2725eb448ea88b1272d72be47e9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2647, status=DOWN, tags=[], tenant_id=7ff1c2725eb448ea88b1272d72be47e9, updated_at=2025-11-26T10:04:48Z on network 4a88868a-c729-4b53-a5da-713d88b5b238#033[00m Nov 26 05:04:50 localhost dnsmasq[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/addn_hosts - 1 addresses Nov 26 05:04:50 localhost dnsmasq-dhcp[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/host Nov 26 05:04:50 localhost dnsmasq-dhcp[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/opts Nov 26 05:04:50 localhost podman[324879]: 2025-11-26 10:04:50.854055844 +0000 UTC m=+0.067480942 container kill aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:04:51 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:04:51.202 262471 INFO neutron.agent.dhcp.agent [None req-30202c8d-2973-49da-b7ff-f6ca78cbc6ff - - - - - -] DHCP configuration for ports {'e8775dc9-c0c5-4924-acfa-5fc704d5a968'} is completed#033[00m Nov 26 05:04:52 localhost nova_compute[281415]: 2025-11-26 10:04:52.333 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:53 localhost nova_compute[281415]: 2025-11-26 10:04:53.671 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:04:56 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:04:56 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:04:57 localhost nova_compute[281415]: 2025-11-26 10:04:57.356 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:57 localhost podman[240049]: time="2025-11-26T10:04:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:04:57 localhost podman[240049]: @ - - [26/Nov/2025:10:04:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 26 05:04:57 localhost podman[240049]: @ - - [26/Nov/2025:10:04:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19259 "" "Go-http-client/1.1" Nov 26 05:04:57 localhost neutron_sriov_agent[255515]: 2025-11-26 10:04:57.938 2 INFO neutron.agent.securitygroups_rpc [None req-4c32c42d-c37c-41f0-8917-4eb9abf67fee 72e1d9471f1d4505b4235fd67f9eaf3f 6f76866a60674c05ab3541d24e53df46 - - default default] Security group rule updated ['f8695f54-3395-4453-ac5c-1bddd2eca909']#033[00m Nov 26 05:04:58 localhost nova_compute[281415]: 2025-11-26 10:04:58.703 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:04:59 
localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:00 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Nov 26 05:05:00 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:05:00 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:05:00 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.399 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:02.464 262471 INFO 
neutron.agent.linux.ip_lib [None req-77a47943-ca89-4e5f-b834-ea2dfbf357a1 - - - - - -] Device tapf40350c4-bb cannot be used as it has no MAC address#033[00m Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.491 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:02 localhost kernel: device tapf40350c4-bb entered promiscuous mode Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.501 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:02 localhost ovn_controller[153664]: 2025-11-26T10:05:02Z|00451|binding|INFO|Claiming lport f40350c4-bb5c-41a1-a1be-f6833d36ae2a for this chassis. Nov 26 05:05:02 localhost ovn_controller[153664]: 2025-11-26T10:05:02Z|00452|binding|INFO|f40350c4-bb5c-41a1-a1be-f6833d36ae2a: Claiming unknown Nov 26 05:05:02 localhost NetworkManager[5970]: [1764151502.5022] manager: (tapf40350c4-bb): new Generic device (/org/freedesktop/NetworkManager/Devices/71) Nov 26 05:05:02 localhost systemd-udevd[324978]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:05:02 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:02.512 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d459403d-2a49-4c53-959d-df6aa94956be', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d459403d-2a49-4c53-959d-df6aa94956be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2fca84bedb04d0b8a425f8e67ff3ca6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef40a8f1-0516-43a7-bccd-a6e14cd4b9f3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f40350c4-bb5c-41a1-a1be-f6833d36ae2a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:02 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:02.515 159486 INFO neutron.agent.ovn.metadata.agent [-] Port f40350c4-bb5c-41a1-a1be-f6833d36ae2a in datapath d459403d-2a49-4c53-959d-df6aa94956be bound to our chassis#033[00m Nov 26 05:05:02 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:02.516 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d459403d-2a49-4c53-959d-df6aa94956be or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:05:02 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:02.517 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[385a7a7f-2cb0-4925-80b3-4525f59833c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:05:02 localhost ovn_controller[153664]: 2025-11-26T10:05:02Z|00453|binding|INFO|Setting lport f40350c4-bb5c-41a1-a1be-f6833d36ae2a ovn-installed in OVS Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.519 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:02 localhost ovn_controller[153664]: 2025-11-26T10:05:02Z|00454|binding|INFO|Setting lport f40350c4-bb5c-41a1-a1be-f6833d36ae2a up in Southbound Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.521 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.547 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.600 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.640 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:02 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:05:02 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 
26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.864 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:02 localhost nova_compute[281415]: 2025-11-26 10:05:02.864 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:03 localhost podman[325051]: Nov 26 05:05:03 localhost podman[325051]: 2025-11-26 10:05:03.657963897 +0000 UTC m=+0.091064563 container create 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:05:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:03.673 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:05:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:03.674 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:05:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:03.675 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:05:03 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Nov 26 05:05:03 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:05:03 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:05:03 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:05:03 localhost ceph-mon[297296]: from='mgr.34351 ' 
entity='mgr.np0005536119.eupicg' Nov 26 05:05:03 localhost podman[325051]: 2025-11-26 10:05:03.608260597 +0000 UTC m=+0.041361303 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:05:03 localhost nova_compute[281415]: 2025-11-26 10:05:03.749 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:03 localhost systemd[1]: Started libpod-conmon-1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e.scope. Nov 26 05:05:03 localhost systemd[1]: Started libcrun container. Nov 26 05:05:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1e88e67d54ad54c2adb312f747bc52324ed2c3787699bf96bcbd65ef66ef356/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:05:03 localhost podman[325051]: 2025-11-26 10:05:03.788602544 +0000 UTC m=+0.221703220 container init 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 05:05:03 localhost podman[325051]: 2025-11-26 10:05:03.80168593 +0000 UTC m=+0.234786596 container start 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 26 05:05:03 localhost dnsmasq[325071]: started, version 2.85 cachesize 150 Nov 26 05:05:03 localhost dnsmasq[325071]: DNS service limited to local subnets Nov 26 05:05:03 localhost dnsmasq[325071]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:05:03 localhost dnsmasq[325071]: warning: no upstream servers configured Nov 26 05:05:03 localhost dnsmasq-dhcp[325071]: DHCP, static leases only on 10.101.0.0, lease time 1d Nov 26 05:05:03 localhost dnsmasq[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/addn_hosts - 0 addresses Nov 26 05:05:03 localhost dnsmasq-dhcp[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/host Nov 26 05:05:03 localhost dnsmasq-dhcp[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/opts Nov 26 05:05:03 localhost nova_compute[281415]: 2025-11-26 10:05:03.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:04 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:04.313 262471 INFO neutron.agent.dhcp.agent [None req-a625257c-634d-424c-9bc6-ced6b5ccb6e2 - - - - - -] DHCP configuration for ports {'9bc17f53-ad9f-4826-ba85-c269bee4d4b7'} is completed#033[00m Nov 26 05:05:04 localhost systemd[1]: tmp-crun.2wJrdE.mount: Deactivated successfully. 
Nov 26 05:05:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:05:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.850 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.850 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:05 localhost podman[325072]: 2025-11-26 10:05:05.850978884 +0000 UTC m=+0.105849491 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:05:05 localhost podman[325072]: 2025-11-26 10:05:05.860236661 +0000 UTC m=+0.115107298 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.873 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.873 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.874 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.874 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: 
np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:05:05 localhost nova_compute[281415]: 2025-11-26 10:05:05.874 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:05:05 localhost systemd[1]: tmp-crun.wfSKJD.mount: Deactivated successfully. Nov 26 05:05:05 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:05:05 localhost podman[325073]: 2025-11-26 10:05:05.910968682 +0000 UTC m=+0.161737532 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, config_id=edpm) Nov 26 05:05:05 localhost podman[325073]: 2025-11-26 10:05:05.990734944 +0000 UTC m=+0.241503844 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, managed_by=edpm_ansible) Nov 26 05:05:06 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:05:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:05:06 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3334858578' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.416 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.500 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.501 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m 
Nov 26 05:05:06 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Nov 26 05:05:06 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Nov 26 05:05:06 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Nov 26 05:05:06 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.741 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.743 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11223MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.743 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.744 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.835 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.835 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.836 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:05:06 localhost nova_compute[281415]: 2025-11-26 10:05:06.896 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:05:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:06.941 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2025-11-26T10:05:06Z, description=, device_id=dab9b6ff-f0d7-463c-95a5-9c9001b5c80e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b7946c42-cde3-455c-a3a4-ad551a4c9fa6, ip_allocation=immediate, mac_address=fa:16:3e:55:bc:f0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:04:57Z, description=, dns_domain=, id=d459403d-2a49-4c53-959d-df6aa94956be, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1271064156, port_security_enabled=True, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4895, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2679, status=ACTIVE, subnets=['5c255ec8-ba97-4992-b146-59545b865201'], tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:01Z, vlan_transparent=None, network_id=d459403d-2a49-4c53-959d-df6aa94956be, port_security_enabled=False, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2743, status=DOWN, tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:06Z on network d459403d-2a49-4c53-959d-df6aa94956be#033[00m Nov 26 05:05:07 localhost dnsmasq[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/addn_hosts - 1 addresses Nov 26 05:05:07 localhost dnsmasq-dhcp[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/host Nov 26 05:05:07 localhost dnsmasq-dhcp[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/opts Nov 26 05:05:07 localhost podman[325169]: 2025-11-26 10:05:07.185918995 +0000 UTC m=+0.065707596 container kill 
1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:05:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:05:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2354628312' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:05:07 localhost nova_compute[281415]: 2025-11-26 10:05:07.370 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:05:07 localhost nova_compute[281415]: 2025-11-26 10:05:07.378 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:05:07 localhost nova_compute[281415]: 2025-11-26 10:05:07.402 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 
15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:05:07 localhost nova_compute[281415]: 2025-11-26 10:05:07.405 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:05:07 localhost nova_compute[281415]: 2025-11-26 10:05:07.405 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:05:07 localhost nova_compute[281415]: 2025-11-26 10:05:07.450 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:07 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:07.490 262471 INFO neutron.agent.dhcp.agent [None req-e268b205-901d-4d41-909d-85049b6e7e13 - - - - - -] DHCP configuration for ports {'b7946c42-cde3-455c-a3a4-ad551a4c9fa6'} is completed#033[00m Nov 26 05:05:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:07 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:07.949 262471 INFO 
neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:05:06Z, description=, device_id=dab9b6ff-f0d7-463c-95a5-9c9001b5c80e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b7946c42-cde3-455c-a3a4-ad551a4c9fa6, ip_allocation=immediate, mac_address=fa:16:3e:55:bc:f0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:04:57Z, description=, dns_domain=, id=d459403d-2a49-4c53-959d-df6aa94956be, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1271064156, port_security_enabled=True, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4895, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2679, status=ACTIVE, subnets=['5c255ec8-ba97-4992-b146-59545b865201'], tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:01Z, vlan_transparent=None, network_id=d459403d-2a49-4c53-959d-df6aa94956be, port_security_enabled=False, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2743, status=DOWN, tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:06Z on network d459403d-2a49-4c53-959d-df6aa94956be#033[00m Nov 26 05:05:08 localhost systemd[1]: tmp-crun.RZ9t7W.mount: Deactivated successfully. 
Nov 26 05:05:08 localhost dnsmasq[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/addn_hosts - 1 addresses Nov 26 05:05:08 localhost dnsmasq-dhcp[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/host Nov 26 05:05:08 localhost dnsmasq-dhcp[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/opts Nov 26 05:05:08 localhost podman[325207]: 2025-11-26 10:05:08.193063191 +0000 UTC m=+0.075827561 container kill 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:05:08 localhost nova_compute[281415]: 2025-11-26 10:05:08.405 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:08 localhost nova_compute[281415]: 2025-11-26 10:05:08.407 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:08.460 262471 INFO neutron.agent.dhcp.agent [None req-24e14b89-41d0-4dd3-a2bb-d2260d1e9328 - - - - - -] DHCP configuration for ports {'b7946c42-cde3-455c-a3a4-ad551a4c9fa6'} is completed#033[00m Nov 26 05:05:08 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon).osd e181 e181: 6 total, 6 up, 6 in Nov 26 05:05:08 localhost nova_compute[281415]: 2025-11-26 10:05:08.805 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:08 localhost sshd[325241]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:05:09 localhost dnsmasq[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/addn_hosts - 0 addresses Nov 26 05:05:09 localhost dnsmasq-dhcp[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/host Nov 26 05:05:09 localhost podman[325246]: 2025-11-26 10:05:09.012490629 +0000 UTC m=+0.064941993 container kill 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true) Nov 26 05:05:09 localhost dnsmasq-dhcp[325071]: read /var/lib/neutron/dhcp/d459403d-2a49-4c53-959d-df6aa94956be/opts Nov 26 05:05:09 localhost kernel: device tapf40350c4-bb left promiscuous mode Nov 26 05:05:09 localhost ovn_controller[153664]: 2025-11-26T10:05:09Z|00455|binding|INFO|Releasing lport f40350c4-bb5c-41a1-a1be-f6833d36ae2a from this chassis (sb_readonly=0) Nov 26 05:05:09 localhost ovn_controller[153664]: 2025-11-26T10:05:09Z|00456|binding|INFO|Setting lport f40350c4-bb5c-41a1-a1be-f6833d36ae2a down in Southbound Nov 26 05:05:09 localhost nova_compute[281415]: 2025-11-26 10:05:09.243 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:09.262 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d459403d-2a49-4c53-959d-df6aa94956be', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d459403d-2a49-4c53-959d-df6aa94956be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2fca84bedb04d0b8a425f8e67ff3ca6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef40a8f1-0516-43a7-bccd-a6e14cd4b9f3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f40350c4-bb5c-41a1-a1be-f6833d36ae2a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:09 localhost nova_compute[281415]: 2025-11-26 10:05:09.265 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:09.264 159486 INFO neutron.agent.ovn.metadata.agent [-] Port f40350c4-bb5c-41a1-a1be-f6833d36ae2a in datapath d459403d-2a49-4c53-959d-df6aa94956be unbound from our 
chassis#033[00m Nov 26 05:05:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:09.266 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d459403d-2a49-4c53-959d-df6aa94956be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:05:09 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:09.268 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[a253495b-acbb-429f-a204-d29b48eb3250]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:05:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:09 localhost nova_compute[281415]: 2025-11-26 10:05:09.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:05:09 localhost nova_compute[281415]: 2025-11-26 10:05:09.850 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:05:09 localhost nova_compute[281415]: 2025-11-26 10:05:09.850 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:05:10 localhost nova_compute[281415]: 2025-11-26 10:05:10.208 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:05:10 localhost nova_compute[281415]: 2025-11-26 10:05:10.208 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:05:10 localhost nova_compute[281415]: 2025-11-26 10:05:10.208 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:05:10 localhost nova_compute[281415]: 2025-11-26 10:05:10.209 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:05:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:10.725 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:10.727 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:05:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:05:10 localhost nova_compute[281415]: 2025-11-26 10:05:10.732 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:05:10 localhost podman[325270]: 2025-11-26 10:05:10.846339539 +0000 UTC m=+0.102359893 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:05:10 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Nov 26 05:05:10 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:05:10 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:05:10 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/bcaccd6d-419b-4afc-b665-b3c287b4ae69/dd6a5a64-5609-4019-86d7-f6f03b8352b6", "osd", "allow rw pool=manila_data namespace=fsvolumens_bcaccd6d-419b-4afc-b665-b3c287b4ae69", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:05:10 localhost systemd[1]: tmp-crun.A7AD05.mount: Deactivated successfully. 
Nov 26 05:05:10 localhost podman[325271]: 2025-11-26 10:05:10.904228012 +0000 UTC m=+0.155261641 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, 
Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.) Nov 26 05:05:10 localhost podman[325270]: 2025-11-26 10:05:10.937822583 +0000 UTC m=+0.193842967 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 05:05:10 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:05:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e182 e182: 6 total, 6 up, 6 in Nov 26 05:05:11 localhost dnsmasq[325071]: exiting on receipt of SIGTERM Nov 26 05:05:11 localhost systemd[1]: libpod-1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e.scope: Deactivated successfully. Nov 26 05:05:11 localhost podman[325327]: 2025-11-26 10:05:11.004927042 +0000 UTC m=+0.084535000 container kill 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:05:11 localhost podman[325271]: 2025-11-26 10:05:11.044421967 +0000 UTC m=+0.295455596 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 26 05:05:11 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:05:11 localhost podman[325343]: 2025-11-26 10:05:11.086211 +0000 UTC m=+0.066325325 container died 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:05:11 localhost podman[325343]: 2025-11-26 10:05:11.119810151 +0000 UTC m=+0.099924416 container cleanup 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:05:11 localhost systemd[1]: libpod-conmon-1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e.scope: Deactivated successfully. Nov 26 05:05:11 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:05:11 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2091031593' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:05:11 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:05:11 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2091031593' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:05:11 localhost podman[325350]: 2025-11-26 10:05:11.169685207 +0000 UTC m=+0.134215900 container remove 1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d459403d-2a49-4c53-959d-df6aa94956be, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:05:11 localhost nova_compute[281415]: 2025-11-26 10:05:11.225 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": 
"5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:05:11 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:11 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:11 localhost nova_compute[281415]: 2025-11-26 10:05:11.250 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:05:11 localhost nova_compute[281415]: 2025-11-26 10:05:11.251 281419 DEBUG nova.compute.manager [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:05:11 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:11 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:11.463 262471 INFO neutron.agent.dhcp.agent [None req-68267bec-8bae-44ce-ac54-6881610f86a8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:11.509 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:11 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:11.708 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:11 localhost systemd[1]: var-lib-containers-storage-overlay-c1e88e67d54ad54c2adb312f747bc52324ed2c3787699bf96bcbd65ef66ef356-merged.mount: Deactivated successfully. Nov 26 05:05:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1059ee613d80c94482b237669d1aa8b13d98089f95be79c0c05f9ccbd3c1205e-userdata-shm.mount: Deactivated successfully. Nov 26 05:05:11 localhost systemd[1]: run-netns-qdhcp\x2dd459403d\x2d2a49\x2d4c53\x2d959d\x2ddf6aa94956be.mount: Deactivated successfully. 
Nov 26 05:05:11 localhost ovn_controller[153664]: 2025-11-26T10:05:11Z|00457|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:05:11 localhost nova_compute[281415]: 2025-11-26 10:05:11.944 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:12 localhost nova_compute[281415]: 2025-11-26 10:05:12.493 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:13 localhost nova_compute[281415]: 2025-11-26 10:05:13.831 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:14 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Nov 26 05:05:14 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Nov 26 05:05:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Nov 26 05:05:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Nov 26 05:05:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 05:05:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:14 localhost podman[325372]: 2025-11-26 10:05:14.829399629 +0000 UTC m=+0.088721900 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:05:14 localhost podman[325372]: 2025-11-26 10:05:14.842398011 +0000 UTC m=+0.101720252 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:05:14 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:05:15 localhost openstack_network_exporter[242153]: ERROR 10:05:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:05:15 localhost openstack_network_exporter[242153]: ERROR 10:05:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:05:15 localhost openstack_network_exporter[242153]: ERROR 10:05:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:05:15 localhost openstack_network_exporter[242153]: ERROR 10:05:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:05:15 localhost openstack_network_exporter[242153]: Nov 26 05:05:15 localhost openstack_network_exporter[242153]: ERROR 10:05:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:05:15 localhost openstack_network_exporter[242153]: Nov 26 05:05:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:05:16.098 2 INFO neutron.agent.securitygroups_rpc [None req-0d0e706d-6661-4b91-aeaa-e55906981431 64586a6ee7b64963ac0eee4039334416 a2fca84bedb04d0b8a425f8e67ff3ca6 - - default default] Security group member updated ['bb2478c9-ccff-435d-b94a-e81432fb5625']#033[00m Nov 26 05:05:17 localhost nova_compute[281415]: 2025-11-26 10:05:17.527 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:17 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Nov 26 05:05:17 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Nov 26 05:05:17 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' 
cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Nov 26 05:05:17 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Nov 26 05:05:18 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:18 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:18 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e183 e183: 6 total, 6 up, 6 in Nov 26 05:05:18 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:18.730 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:05:18 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. Nov 26 05:05:18 localhost neutron_sriov_agent[255515]: 2025-11-26 10:05:18.889 2 INFO neutron.agent.securitygroups_rpc [None req-707a380d-78b2-4f4f-8ffa-4c3f74e88c67 64586a6ee7b64963ac0eee4039334416 a2fca84bedb04d0b8a425f8e67ff3ca6 - - default default] Security group member updated ['bb2478c9-ccff-435d-b94a-e81432fb5625']#033[00m Nov 26 05:05:18 localhost nova_compute[281415]: 2025-11-26 10:05:18.897 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 05:05:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:05:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e184 e184: 6 total, 6 up, 6 in Nov 26 05:05:19 localhost podman[325396]: 2025-11-26 10:05:19.849976405 +0000 UTC m=+0.107119289 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 26 05:05:19 localhost podman[325396]: 2025-11-26 10:05:19.864350621 +0000 UTC m=+0.121493455 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2) Nov 26 05:05:19 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:05:19 localhost systemd[1]: tmp-crun.KvAShv.mount: Deactivated successfully. Nov 26 05:05:19 localhost podman[325395]: 2025-11-26 10:05:19.987665041 +0000 UTC m=+0.249079268 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 26 05:05:19 localhost podman[325395]: 2025-11-26 10:05:19.99211983 +0000 UTC m=+0.253534067 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS) Nov 26 05:05:20 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:05:21 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e185 e185: 6 total, 6 up, 6 in Nov 26 05:05:22 localhost nova_compute[281415]: 2025-11-26 10:05:22.568 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e186 e186: 6 total, 6 up, 6 in Nov 26 05:05:23 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:23.197 262471 INFO neutron.agent.linux.ip_lib [None req-e37f9fc8-2cfa-48bc-9ebf-94fde17c4071 - - - - - -] Device tap6b8178a9-5f cannot be used as it has no MAC address#033[00m Nov 26 05:05:23 localhost nova_compute[281415]: 2025-11-26 10:05:23.222 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:23 localhost kernel: device tap6b8178a9-5f entered promiscuous mode Nov 26 05:05:23 localhost NetworkManager[5970]: [1764151523.2324] manager: (tap6b8178a9-5f): new Generic device (/org/freedesktop/NetworkManager/Devices/72) Nov 26 05:05:23 localhost nova_compute[281415]: 2025-11-26 10:05:23.233 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:23 localhost ovn_controller[153664]: 2025-11-26T10:05:23Z|00458|binding|INFO|Claiming lport 6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1 for this chassis. 
Nov 26 05:05:23 localhost ovn_controller[153664]: 2025-11-26T10:05:23Z|00459|binding|INFO|6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1: Claiming unknown Nov 26 05:05:23 localhost systemd-udevd[325443]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:05:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:23.245 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-9a3caa8f-e7f9-48ef-8867-1aa3a75179be', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a3caa8f-e7f9-48ef-8867-1aa3a75179be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2fca84bedb04d0b8a425f8e67ff3ca6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97745035-c2a6-4f11-ae26-41cbe22f1b48, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:23.247 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1 in datapath 9a3caa8f-e7f9-48ef-8867-1aa3a75179be bound to our chassis#033[00m Nov 26 05:05:23 localhost ovn_metadata_agent[159481]: 
2025-11-26 10:05:23.249 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9d4bee22-a619-4ed7-a09e-b1f1abedce1e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:05:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:23.250 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a3caa8f-e7f9-48ef-8867-1aa3a75179be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:05:23 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:23.251 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[66dd4e8b-7558-4e62-96cb-6c85725c09d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:05:23 localhost journal[229445]: ethtool ioctl error on tap6b8178a9-5f: No such device Nov 26 05:05:23 localhost ovn_controller[153664]: 2025-11-26T10:05:23Z|00460|binding|INFO|Setting lport 6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1 ovn-installed in OVS Nov 26 05:05:23 localhost ovn_controller[153664]: 2025-11-26T10:05:23Z|00461|binding|INFO|Setting lport 6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1 up in Southbound Nov 26 05:05:23 localhost nova_compute[281415]: 2025-11-26 10:05:23.273 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:23 localhost journal[229445]: ethtool ioctl error on tap6b8178a9-5f: No such device Nov 26 05:05:23 localhost journal[229445]: ethtool ioctl error on tap6b8178a9-5f: No such device Nov 26 05:05:23 localhost journal[229445]: ethtool ioctl error on tap6b8178a9-5f: No such device Nov 26 05:05:23 localhost journal[229445]: ethtool ioctl error on tap6b8178a9-5f: No such device Nov 26 05:05:23 localhost journal[229445]: ethtool ioctl error on tap6b8178a9-5f: No 
such device Nov 26 05:05:23 localhost journal[229445]: ethtool ioctl error on tap6b8178a9-5f: No such device Nov 26 05:05:23 localhost journal[229445]: ethtool ioctl error on tap6b8178a9-5f: No such device Nov 26 05:05:23 localhost nova_compute[281415]: 2025-11-26 10:05:23.320 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:23 localhost nova_compute[281415]: 2025-11-26 10:05:23.364 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:23 localhost nova_compute[281415]: 2025-11-26 10:05:23.941 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:24 localhost podman[325514]: Nov 26 05:05:24 localhost podman[325514]: 2025-11-26 10:05:24.33120158 +0000 UTC m=+0.095734877 container create 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118) Nov 26 05:05:24 localhost systemd[1]: Started libpod-conmon-41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121.scope. Nov 26 05:05:24 localhost podman[325514]: 2025-11-26 10:05:24.284758131 +0000 UTC m=+0.049291448 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:05:24 localhost systemd[1]: Started libcrun container. 
Nov 26 05:05:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:05:24 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4088433715' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:05:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:05:24 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4088433715' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:05:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32f39a484ddfd1bce8768e36ceb76ad71a53ef7e135f686ed08046734c944348/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:05:24 localhost podman[325514]: 2025-11-26 10:05:24.412878381 +0000 UTC m=+0.177411668 container init 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:05:24 localhost podman[325514]: 2025-11-26 10:05:24.422384835 +0000 UTC m=+0.186918132 container start 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, org.label-schema.schema-version=1.0, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:05:24 localhost dnsmasq[325532]: started, version 2.85 cachesize 150 Nov 26 05:05:24 localhost dnsmasq[325532]: DNS service limited to local subnets Nov 26 05:05:24 localhost dnsmasq[325532]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:05:24 localhost dnsmasq[325532]: warning: no upstream servers configured Nov 26 05:05:24 localhost dnsmasq-dhcp[325532]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:05:24 localhost dnsmasq[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/addn_hosts - 0 addresses Nov 26 05:05:24 localhost dnsmasq-dhcp[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/host Nov 26 05:05:24 localhost dnsmasq-dhcp[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/opts Nov 26 05:05:24 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:24.486 262471 INFO neutron.agent.dhcp.agent [None req-16f56aa3-1401-4931-a340-3c5b5f29dee4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:05:22Z, description=, device_id=6bfa5bdf-9be1-445e-92d5-cd359ad350da, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8954bb78-a2f0-4366-900c-f672ca37ab36, ip_allocation=immediate, mac_address=fa:16:3e:0e:be:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-11-26T10:05:20Z, description=, dns_domain=, id=9a3caa8f-e7f9-48ef-8867-1aa3a75179be, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1666522898, port_security_enabled=True, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6657, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2791, status=ACTIVE, subnets=['8ad8c0ce-70d9-43f2-bbd7-e48d9a481312'], tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:21Z, vlan_transparent=None, network_id=9a3caa8f-e7f9-48ef-8867-1aa3a75179be, port_security_enabled=False, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2815, status=DOWN, tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:22Z on network 9a3caa8f-e7f9-48ef-8867-1aa3a75179be#033[00m Nov 26 05:05:24 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:24.606 262471 INFO neutron.agent.dhcp.agent [None req-f7145b4f-1d1a-4947-ac43-76aeaf18da15 - - - - - -] DHCP configuration for ports {'6cdf9b00-bf43-4503-b572-5985c85a2636'} is completed#033[00m Nov 26 05:05:24 localhost dnsmasq[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/addn_hosts - 1 addresses Nov 26 05:05:24 localhost podman[325551]: 2025-11-26 10:05:24.710801922 +0000 UTC m=+0.069465074 container kill 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true) Nov 26 05:05:24 localhost dnsmasq-dhcp[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/host Nov 26 05:05:24 localhost dnsmasq-dhcp[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/opts Nov 26 05:05:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:24 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:24.876 262471 INFO neutron.agent.dhcp.agent [None req-0a9a4775-2c81-47bd-af48-2f271c1883fa - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:05:22Z, description=, device_id=6bfa5bdf-9be1-445e-92d5-cd359ad350da, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8954bb78-a2f0-4366-900c-f672ca37ab36, ip_allocation=immediate, mac_address=fa:16:3e:0e:be:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:05:20Z, description=, dns_domain=, id=9a3caa8f-e7f9-48ef-8867-1aa3a75179be, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1666522898, port_security_enabled=True, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6657, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2791, status=ACTIVE, subnets=['8ad8c0ce-70d9-43f2-bbd7-e48d9a481312'], tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:21Z, vlan_transparent=None, 
network_id=9a3caa8f-e7f9-48ef-8867-1aa3a75179be, port_security_enabled=False, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2815, status=DOWN, tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:22Z on network 9a3caa8f-e7f9-48ef-8867-1aa3a75179be#033[00m Nov 26 05:05:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:25.076 262471 INFO neutron.agent.dhcp.agent [None req-96a16be0-0436-461a-8ceb-98a46793c2fd - - - - - -] DHCP configuration for ports {'8954bb78-a2f0-4366-900c-f672ca37ab36'} is completed#033[00m Nov 26 05:05:25 localhost podman[325588]: 2025-11-26 10:05:25.12031116 +0000 UTC m=+0.061197987 container kill 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:05:25 localhost dnsmasq[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/addn_hosts - 1 addresses Nov 26 05:05:25 localhost dnsmasq-dhcp[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/host Nov 26 05:05:25 localhost dnsmasq-dhcp[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/opts Nov 26 05:05:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:05:25 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3114622527' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:05:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:05:25 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3114622527' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:05:25 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:25.418 262471 INFO neutron.agent.dhcp.agent [None req-39492a67-46ca-442a-bab4-2425bbd9885b - - - - - -] DHCP configuration for ports {'8954bb78-a2f0-4366-900c-f672ca37ab36'} is completed#033[00m Nov 26 05:05:25 localhost dnsmasq[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/addn_hosts - 0 addresses Nov 26 05:05:25 localhost dnsmasq-dhcp[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/host Nov 26 05:05:25 localhost podman[325626]: 2025-11-26 10:05:25.697874405 +0000 UTC m=+0.059139874 container kill 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 05:05:25 localhost dnsmasq-dhcp[325532]: read /var/lib/neutron/dhcp/9a3caa8f-e7f9-48ef-8867-1aa3a75179be/opts Nov 26 05:05:25 localhost ovn_controller[153664]: 2025-11-26T10:05:25Z|00462|binding|INFO|Releasing lport 6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1 from this chassis (sb_readonly=0) Nov 26 05:05:25 localhost 
kernel: device tap6b8178a9-5f left promiscuous mode Nov 26 05:05:25 localhost ovn_controller[153664]: 2025-11-26T10:05:25Z|00463|binding|INFO|Setting lport 6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1 down in Southbound Nov 26 05:05:25 localhost nova_compute[281415]: 2025-11-26 10:05:25.908 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:25.928 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-9a3caa8f-e7f9-48ef-8867-1aa3a75179be', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a3caa8f-e7f9-48ef-8867-1aa3a75179be', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2fca84bedb04d0b8a425f8e67ff3ca6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97745035-c2a6-4f11-ae26-41cbe22f1b48, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:25.930 159486 
INFO neutron.agent.ovn.metadata.agent [-] Port 6b8178a9-5fb2-4311-bcc0-1da9bcad9ec1 in datapath 9a3caa8f-e7f9-48ef-8867-1aa3a75179be unbound from our chassis#033[00m Nov 26 05:05:25 localhost nova_compute[281415]: 2025-11-26 10:05:25.932 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:25.933 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a3caa8f-e7f9-48ef-8867-1aa3a75179be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:05:25 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:25.934 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9897d1-112e-4f37-8f4c-4e0b78c4d478]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:05:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:26 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:26 localhost dnsmasq[325532]: exiting on receipt of SIGTERM Nov 26 05:05:26 localhost podman[325665]: 2025-11-26 10:05:26.882184258 +0000 UTC m=+0.058187435 container kill 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:05:26 localhost systemd[1]: libpod-41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121.scope: Deactivated successfully. Nov 26 05:05:26 localhost podman[325679]: 2025-11-26 10:05:26.962672493 +0000 UTC m=+0.065852422 container died 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:05:27 localhost podman[325679]: 2025-11-26 10:05:27.053166146 +0000 UTC m=+0.156346025 container cleanup 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:05:27 localhost systemd[1]: libpod-conmon-41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121.scope: Deactivated successfully. 
Nov 26 05:05:27 localhost podman[325682]: 2025-11-26 10:05:27.077045287 +0000 UTC m=+0.165857511 container remove 41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a3caa8f-e7f9-48ef-8867-1aa3a75179be, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:05:27 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:27.125 262471 INFO neutron.agent.dhcp.agent [None req-42274948-1bda-494f-829f-9e493428b392 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:27 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:27.173 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:27 localhost ovn_controller[153664]: 2025-11-26T10:05:27Z|00464|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:05:27 localhost nova_compute[281415]: 2025-11-26 10:05:27.384 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:27 localhost podman[240049]: time="2025-11-26T10:05:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:05:27 localhost podman[240049]: @ - - [26/Nov/2025:10:05:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 26 05:05:27 localhost podman[240049]: @ - - [26/Nov/2025:10:05:27 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19247 "" "Go-http-client/1.1" Nov 26 05:05:27 localhost nova_compute[281415]: 2025-11-26 10:05:27.619 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:27 localhost systemd[1]: var-lib-containers-storage-overlay-32f39a484ddfd1bce8768e36ceb76ad71a53ef7e135f686ed08046734c944348-merged.mount: Deactivated successfully. Nov 26 05:05:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-41ceaaa10ea2ccf62aa1fa7e02122c079856f24eac78664331cdd40176327121-userdata-shm.mount: Deactivated successfully. Nov 26 05:05:27 localhost systemd[1]: run-netns-qdhcp\x2d9a3caa8f\x2de7f9\x2d48ef\x2d8867\x2d1aa3a75179be.mount: Deactivated successfully. Nov 26 05:05:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e187 e187: 6 total, 6 up, 6 in Nov 26 05:05:28 localhost nova_compute[281415]: 2025-11-26 10:05:28.982 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:30 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:31 localhost sshd[325710]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:05:32 localhost nova_compute[281415]: 2025-11-26 10:05:32.665 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:32 
localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:32 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:34 localhost nova_compute[281415]: 2025-11-26 10:05:34.013 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:05:36 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2672993884' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:05:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:05:36 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2672993884' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:05:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:05:36 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/952887224' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:05:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:05:36 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/952887224' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:05:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:05:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:05:36 localhost podman[325712]: 2025-11-26 10:05:36.845252581 +0000 UTC m=+0.097715799 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:05:36 localhost podman[325713]: 2025-11-26 10:05:36.914222707 +0000 UTC m=+0.163773105 container health_status 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true) Nov 26 05:05:36 localhost podman[325713]: 2025-11-26 10:05:36.930955465 +0000 UTC m=+0.180505873 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e 
(image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:05:36 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 05:05:36 localhost podman[325712]: 2025-11-26 10:05:36.988461098 +0000 UTC m=+0.240924336 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 05:05:37 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:05:37 localhost nova_compute[281415]: 2025-11-26 10:05:37.727 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0. 
Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.585118) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43 Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151538585227, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 2152, "num_deletes": 267, "total_data_size": 3618747, "memory_usage": 3671664, "flush_reason": "Manual Compaction"} Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151538600441, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 2364261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25593, "largest_seqno": 27740, "table_properties": {"data_size": 2355780, "index_size": 5050, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 20251, "raw_average_key_size": 21, "raw_value_size": 2337739, "raw_average_value_size": 2484, "num_data_blocks": 218, "num_entries": 941, "num_filter_entries": 941, "num_deletions": 267, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151432, "oldest_key_time": 1764151432, "file_creation_time": 1764151538, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}} Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 15393 microseconds, and 7458 cpu microseconds. Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.600515) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 2364261 bytes OK Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.600551) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.603751) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.603776) EVENT_LOG_v1 {"time_micros": 1764151538603769, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.603800) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 3608528, prev total WAL file 
size 3609277, number of live WAL files 2. Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.604864) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303138' seq:72057594037927935, type:22 .. '6C6F676D0034323639' seq:0, type:0; will stop at (end) Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(2308KB)], [42(15MB)] Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151538605013, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 18719582, "oldest_snapshot_seqno": -1} Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12809 keys, 18127919 bytes, temperature: kUnknown Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151538695693, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 18127919, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18051568, "index_size": 43226, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32069, "raw_key_size": 342266, "raw_average_key_size": 26, "raw_value_size": 17830331, 
"raw_average_value_size": 1392, "num_data_blocks": 1644, "num_entries": 12809, "num_filter_entries": 12809, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151538, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}} Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.696272) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 18127919 bytes Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.698126) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 205.7 rd, 199.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 15.6 +0.0 blob) out(17.3 +0.0 blob), read-write-amplify(15.6) write-amplify(7.7) OK, records in: 13359, records dropped: 550 output_compression: NoCompression Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.698160) EVENT_LOG_v1 {"time_micros": 1764151538698143, "job": 24, "event": "compaction_finished", "compaction_time_micros": 90989, "compaction_time_cpu_micros": 58959, "output_level": 6, "num_output_files": 1, "total_output_size": 18127919, "num_input_records": 13359, "num_output_records": 12809, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151538699079, "job": 24, "event": "table_file_deletion", "file_number": 44} Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151538701827, 
"job": 24, "event": "table_file_deletion", "file_number": 42} Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.604775) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.701978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.701984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.701987) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.701991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:05:38 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:05:38.701994) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:05:39 localhost nova_compute[281415]: 2025-11-26 10:05:39.049 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e188 e188: 6 total, 6 up, 6 in Nov 26 05:05:41 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:41 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : 
dispatch Nov 26 05:05:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:05:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:05:41 localhost podman[325753]: 2025-11-26 10:05:41.838414077 +0000 UTC m=+0.093935172 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 05:05:41 localhost podman[325754]: 2025-11-26 10:05:41.902183753 +0000 UTC m=+0.151891658 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 26 05:05:41 localhost podman[325753]: 2025-11-26 10:05:41.909598152 +0000 UTC m=+0.165119217 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:05:41 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:05:41 localhost podman[325754]: 2025-11-26 10:05:41.96570283 +0000 UTC m=+0.215410765 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, release=1755695350) Nov 26 05:05:41 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:05:42 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:42 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:42 localhost nova_compute[281415]: 2025-11-26 10:05:42.762 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:42 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e189 e189: 6 total, 6 up, 6 in Nov 26 05:05:44 localhost nova_compute[281415]: 2025-11-26 10:05:44.078 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:44 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:44.470 262471 INFO neutron.agent.linux.ip_lib [None req-de19be54-6c3f-4fb8-96ae-447d6a56ea9f - - - - - -] Device tapb24a5e0e-a6 cannot be used as it has no MAC address#033[00m Nov 26 05:05:44 localhost nova_compute[281415]: 2025-11-26 10:05:44.504 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:44 localhost kernel: device tapb24a5e0e-a6 entered promiscuous mode Nov 26 05:05:44 localhost ovn_controller[153664]: 2025-11-26T10:05:44Z|00465|binding|INFO|Claiming lport b24a5e0e-a63c-47cf-884e-94a601a6077e for this chassis. 
Nov 26 05:05:44 localhost NetworkManager[5970]: [1764151544.5194] manager: (tapb24a5e0e-a6): new Generic device (/org/freedesktop/NetworkManager/Devices/73) Nov 26 05:05:44 localhost ovn_controller[153664]: 2025-11-26T10:05:44Z|00466|binding|INFO|b24a5e0e-a63c-47cf-884e-94a601a6077e: Claiming unknown Nov 26 05:05:44 localhost nova_compute[281415]: 2025-11-26 10:05:44.520 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:44 localhost systemd-udevd[325808]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:05:44 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:44.536 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-0a44989c-90f0-4237-9878-ccfc1abd8dca', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a44989c-90f0-4237-9878-ccfc1abd8dca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2fca84bedb04d0b8a425f8e67ff3ca6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9b9f79a-3d96-4c31-bdee-1bad04bcc568, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b24a5e0e-a63c-47cf-884e-94a601a6077e) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:44 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:44.538 159486 INFO neutron.agent.ovn.metadata.agent [-] Port b24a5e0e-a63c-47cf-884e-94a601a6077e in datapath 0a44989c-90f0-4237-9878-ccfc1abd8dca bound to our chassis#033[00m Nov 26 05:05:44 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:44.541 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5da1578b-d5c9-4453-a2b2-62e8768ca215 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:05:44 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:44.541 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a44989c-90f0-4237-9878-ccfc1abd8dca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:05:44 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:44.543 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[09d5188c-9f95-4c47-a263-f9c9f1d63968]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:05:44 localhost journal[229445]: ethtool ioctl error on tapb24a5e0e-a6: No such device Nov 26 05:05:44 localhost journal[229445]: ethtool ioctl error on tapb24a5e0e-a6: No such device Nov 26 05:05:44 localhost journal[229445]: ethtool ioctl error on tapb24a5e0e-a6: No such device Nov 26 05:05:44 localhost journal[229445]: ethtool ioctl error on tapb24a5e0e-a6: No such device Nov 26 05:05:44 localhost ovn_controller[153664]: 2025-11-26T10:05:44Z|00467|binding|INFO|Setting lport b24a5e0e-a63c-47cf-884e-94a601a6077e ovn-installed in OVS Nov 26 05:05:44 localhost ovn_controller[153664]: 2025-11-26T10:05:44Z|00468|binding|INFO|Setting lport b24a5e0e-a63c-47cf-884e-94a601a6077e up in Southbound Nov 26 05:05:44 
localhost journal[229445]: ethtool ioctl error on tapb24a5e0e-a6: No such device Nov 26 05:05:44 localhost nova_compute[281415]: 2025-11-26 10:05:44.570 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:44 localhost journal[229445]: ethtool ioctl error on tapb24a5e0e-a6: No such device Nov 26 05:05:44 localhost journal[229445]: ethtool ioctl error on tapb24a5e0e-a6: No such device Nov 26 05:05:44 localhost journal[229445]: ethtool ioctl error on tapb24a5e0e-a6: No such device Nov 26 05:05:44 localhost nova_compute[281415]: 2025-11-26 10:05:44.617 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:44 localhost nova_compute[281415]: 2025-11-26 10:05:44.656 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:45 localhost nova_compute[281415]: 2025-11-26 10:05:45.592 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:45 localhost podman[325893]: Nov 26 05:05:45 localhost dnsmasq[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/addn_hosts - 0 addresses Nov 26 05:05:45 localhost dnsmasq-dhcp[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/host Nov 26 05:05:45 localhost dnsmasq-dhcp[324723]: read /var/lib/neutron/dhcp/4a88868a-c729-4b53-a5da-713d88b5b238/opts Nov 26 05:05:45 localhost podman[325905]: 2025-11-26 10:05:45.659752036 +0000 UTC m=+0.078741030 container kill aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:05:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:05:45 localhost podman[325893]: 2025-11-26 10:05:45.69859693 +0000 UTC m=+0.163451156 container create 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:05:45 localhost podman[325893]: 2025-11-26 10:05:45.603839984 +0000 UTC m=+0.068694260 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:05:45 localhost systemd[1]: Started libpod-conmon-5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5.scope. 
Nov 26 05:05:45 localhost openstack_network_exporter[242153]: ERROR 10:05:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:05:45 localhost openstack_network_exporter[242153]: ERROR 10:05:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:05:45 localhost openstack_network_exporter[242153]: ERROR 10:05:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:05:45 localhost openstack_network_exporter[242153]: ERROR 10:05:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:05:45 localhost openstack_network_exporter[242153]: Nov 26 05:05:45 localhost openstack_network_exporter[242153]: ERROR 10:05:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:05:45 localhost openstack_network_exporter[242153]: Nov 26 05:05:45 localhost systemd[1]: Started libcrun container. 
Nov 26 05:05:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/785ba14175673e2945c94fa663dcc64f5675db0d711bc0e0bb93e1573f581e58/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:05:45 localhost podman[325922]: 2025-11-26 10:05:45.802304473 +0000 UTC m=+0.114305243 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:05:45 localhost podman[325893]: 2025-11-26 10:05:45.810743015 +0000 UTC m=+0.275597261 container init 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true) Nov 26 05:05:45 localhost podman[325893]: 2025-11-26 10:05:45.821603551 +0000 UTC m=+0.286457807 container start 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:05:45 localhost dnsmasq[325953]: started, version 2.85 cachesize 150 Nov 26 05:05:45 localhost dnsmasq[325953]: DNS service limited to local subnets Nov 26 05:05:45 localhost dnsmasq[325953]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:05:45 localhost dnsmasq[325953]: warning: no upstream servers configured Nov 26 05:05:45 localhost dnsmasq-dhcp[325953]: DHCP, static leases only on 10.101.0.0, lease time 1d Nov 26 05:05:45 localhost dnsmasq[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/addn_hosts - 0 addresses Nov 26 05:05:45 localhost dnsmasq-dhcp[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/host Nov 26 05:05:45 localhost dnsmasq-dhcp[325953]: read 
/var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/opts Nov 26 05:05:45 localhost podman[325922]: 2025-11-26 10:05:45.838963299 +0000 UTC m=+0.150964089 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:05:45 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:05:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:45.900 262471 INFO neutron.agent.dhcp.agent [None req-cf18d232-f4e4-45b2-8752-09df34312c50 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:05:43Z, description=, device_id=d9493b58-53da-4a8c-923b-79b05cde4931, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=86a71ee7-e5b2-4f4c-9a5f-3b61a080c08a, ip_allocation=immediate, mac_address=fa:16:3e:f2:42:fb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:05:41Z, description=, dns_domain=, id=0a44989c-90f0-4237-9878-ccfc1abd8dca, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1089884987, port_security_enabled=True, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61832, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2947, status=ACTIVE, subnets=['ec9ca735-d1bf-42c6-a072-9673eae5b972'], tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:42Z, vlan_transparent=None, network_id=0a44989c-90f0-4237-9878-ccfc1abd8dca, port_security_enabled=False, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2958, status=DOWN, tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:44Z on network 0a44989c-90f0-4237-9878-ccfc1abd8dca#033[00m Nov 26 05:05:45 localhost sshd[325958]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:05:45 localhost ovn_controller[153664]: 
2025-11-26T10:05:45Z|00469|binding|INFO|Releasing lport ee9a6356-2bbd-47d2-b872-41e118ba6f17 from this chassis (sb_readonly=0) Nov 26 05:05:45 localhost ovn_controller[153664]: 2025-11-26T10:05:45Z|00470|binding|INFO|Setting lport ee9a6356-2bbd-47d2-b872-41e118ba6f17 down in Southbound Nov 26 05:05:45 localhost kernel: device tapee9a6356-2b left promiscuous mode Nov 26 05:05:45 localhost nova_compute[281415]: 2025-11-26 10:05:45.942 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:45.950 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4a88868a-c729-4b53-a5da-713d88b5b238', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a88868a-c729-4b53-a5da-713d88b5b238', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ff1c2725eb448ea88b1272d72be47e9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53ae7ca7-f004-4de2-b27e-b430deae157d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ee9a6356-2bbd-47d2-b872-41e118ba6f17) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:45.952 159486 INFO neutron.agent.ovn.metadata.agent [-] Port ee9a6356-2bbd-47d2-b872-41e118ba6f17 in datapath 4a88868a-c729-4b53-a5da-713d88b5b238 unbound from our chassis#033[00m Nov 26 05:05:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:45.954 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a88868a-c729-4b53-a5da-713d88b5b238, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:05:45 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:45.955 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[d77461e2-565c-4922-8124-2f6ebd19edd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:05:45 localhost nova_compute[281415]: 2025-11-26 10:05:45.967 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:46.004 262471 INFO neutron.agent.dhcp.agent [None req-78045a05-5f09-443c-8f45-39e470c3e676 - - - - - -] DHCP configuration for ports {'2b23854f-b541-4c0b-b5fe-1672426fa3b0'} is completed#033[00m Nov 26 05:05:46 localhost dnsmasq[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/addn_hosts - 1 addresses Nov 26 05:05:46 localhost dnsmasq-dhcp[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/host Nov 26 05:05:46 localhost podman[325979]: 2025-11-26 10:05:46.176050073 +0000 UTC m=+0.065219822 container kill 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:05:46 localhost dnsmasq-dhcp[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/opts Nov 26 05:05:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:46.344 262471 INFO neutron.agent.dhcp.agent [None req-b2125a47-915f-404b-957a-366cd44f9657 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:05:43Z, description=, device_id=d9493b58-53da-4a8c-923b-79b05cde4931, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=86a71ee7-e5b2-4f4c-9a5f-3b61a080c08a, ip_allocation=immediate, mac_address=fa:16:3e:f2:42:fb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:05:41Z, description=, dns_domain=, id=0a44989c-90f0-4237-9878-ccfc1abd8dca, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1089884987, port_security_enabled=True, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61832, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2947, status=ACTIVE, subnets=['ec9ca735-d1bf-42c6-a072-9673eae5b972'], tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:42Z, vlan_transparent=None, network_id=0a44989c-90f0-4237-9878-ccfc1abd8dca, port_security_enabled=False, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2958, status=DOWN, tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:44Z on network 0a44989c-90f0-4237-9878-ccfc1abd8dca#033[00m Nov 26 05:05:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:46.533 262471 INFO neutron.agent.dhcp.agent [None req-e9e62db1-c3de-4e4a-9b6b-5c183b8a375b - - - - - -] DHCP configuration for ports {'86a71ee7-e5b2-4f4c-9a5f-3b61a080c08a'} is completed#033[00m Nov 26 05:05:46 localhost dnsmasq[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/addn_hosts - 1 addresses Nov 26 05:05:46 localhost dnsmasq-dhcp[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/host Nov 26 05:05:46 localhost podman[326018]: 2025-11-26 10:05:46.613418414 +0000 UTC m=+0.064076276 container kill 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 26 05:05:46 localhost dnsmasq-dhcp[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/opts Nov 26 05:05:46 localhost systemd[1]: tmp-crun.No47wH.mount: Deactivated successfully. 
Nov 26 05:05:46 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:46.887 262471 INFO neutron.agent.dhcp.agent [None req-2dabb6bf-0dc5-4e40-8256-351f5c0ad931 - - - - - -] DHCP configuration for ports {'86a71ee7-e5b2-4f4c-9a5f-3b61a080c08a'} is completed#033[00m Nov 26 05:05:47 localhost nova_compute[281415]: 2025-11-26 10:05:47.801 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:47 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e190 e190: 6 total, 6 up, 6 in Nov 26 05:05:47 localhost ovn_controller[153664]: 2025-11-26T10:05:47Z|00471|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:05:48 localhost nova_compute[281415]: 2025-11-26 10:05:48.026 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:48 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e191 e191: 6 total, 6 up, 6 in Nov 26 05:05:48 localhost dnsmasq[324723]: exiting on receipt of SIGTERM Nov 26 05:05:48 localhost podman[326055]: 2025-11-26 10:05:48.622505113 +0000 UTC m=+0.080654060 container kill aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:05:48 localhost systemd[1]: libpod-aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86.scope: Deactivated successfully. 
Nov 26 05:05:48 localhost podman[326069]: 2025-11-26 10:05:48.69763312 +0000 UTC m=+0.062104535 container died aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:05:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86-userdata-shm.mount: Deactivated successfully. Nov 26 05:05:48 localhost podman[326069]: 2025-11-26 10:05:48.735834034 +0000 UTC m=+0.100305409 container cleanup aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 26 05:05:48 localhost systemd[1]: libpod-conmon-aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86.scope: Deactivated successfully. 
Nov 26 05:05:48 localhost podman[326076]: 2025-11-26 10:05:48.791038555 +0000 UTC m=+0.138522974 container remove aec2d0389f5396e8067f7c81ed47f518b13f9aeca9ab22dc138a7e73723c0f86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a88868a-c729-4b53-a5da-713d88b5b238, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:05:48 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:48.820 262471 INFO neutron.agent.dhcp.agent [None req-729c2f6b-db8a-4232-8b16-89d2573e7bc6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:48 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:48.973 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:49 localhost nova_compute[281415]: 2025-11-26 10:05:49.081 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:49 localhost systemd[1]: var-lib-containers-storage-overlay-0b8a8cf1a0ae1b45df0b3b0e6fb1e15158fe808e3344fb8bba88568a622dab59-merged.mount: Deactivated successfully. Nov 26 05:05:49 localhost systemd[1]: run-netns-qdhcp\x2d4a88868a\x2dc729\x2d4b53\x2da5da\x2d713d88b5b238.mount: Deactivated successfully. 
Nov 26 05:05:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:50 localhost nova_compute[281415]: 2025-11-26 10:05:50.180 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:05:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:05:50 localhost podman[326101]: 2025-11-26 10:05:50.832323881 +0000 UTC m=+0.091031571 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true) Nov 26 05:05:50 localhost podman[326102]: 2025-11-26 10:05:50.893098534 +0000 UTC m=+0.147609565 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Nov 26 05:05:50 localhost podman[326101]: 2025-11-26 10:05:50.913691802 +0000 UTC m=+0.172399502 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:05:50 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:05:50 localhost podman[326102]: 2025-11-26 10:05:50.932421202 +0000 UTC m=+0.186932203 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 05:05:50 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:05:52 localhost nova_compute[281415]: 2025-11-26 10:05:52.831 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:53 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:53.196 262471 INFO neutron.agent.linux.ip_lib [None req-baa3f49d-c0c2-44f9-9cad-5baa7b2ce7d2 - - - - - -] Device tapab7de840-06 cannot be used as it has no MAC address#033[00m Nov 26 05:05:53 localhost nova_compute[281415]: 2025-11-26 10:05:53.231 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:53 localhost kernel: device tapab7de840-06 entered promiscuous mode Nov 26 05:05:53 localhost NetworkManager[5970]: [1764151553.2406] manager: (tapab7de840-06): new Generic device (/org/freedesktop/NetworkManager/Devices/74) Nov 26 05:05:53 localhost ovn_controller[153664]: 2025-11-26T10:05:53Z|00472|binding|INFO|Claiming lport ab7de840-06b3-40b5-842d-18d043379e58 for this chassis. 
Nov 26 05:05:53 localhost ovn_controller[153664]: 2025-11-26T10:05:53Z|00473|binding|INFO|ab7de840-06b3-40b5-842d-18d043379e58: Claiming unknown Nov 26 05:05:53 localhost nova_compute[281415]: 2025-11-26 10:05:53.242 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:53 localhost systemd-udevd[326148]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:05:53 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:53.254 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5d4c3033-661c-42a2-85cc-29f8d2e3b13e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d4c3033-661c-42a2-85cc-29f8d2e3b13e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2fca84bedb04d0b8a425f8e67ff3ca6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5132876d-9dfd-429e-89e4-1f59c26179fc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ab7de840-06b3-40b5-842d-18d043379e58) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:53 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:53.257 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port ab7de840-06b3-40b5-842d-18d043379e58 in datapath 5d4c3033-661c-42a2-85cc-29f8d2e3b13e bound to our chassis#033[00m Nov 26 05:05:53 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:53.260 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port 25f7c157-a4e3-4456-91d8-8de95c1fc7a1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:05:53 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:53.260 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d4c3033-661c-42a2-85cc-29f8d2e3b13e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:05:53 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:53.261 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[0e196b4e-af64-4b50-abc9-01a879d9acaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:05:53 localhost journal[229445]: ethtool ioctl error on tapab7de840-06: No such device Nov 26 05:05:53 localhost nova_compute[281415]: 2025-11-26 10:05:53.276 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:53 localhost ovn_controller[153664]: 2025-11-26T10:05:53Z|00474|binding|INFO|Setting lport ab7de840-06b3-40b5-842d-18d043379e58 ovn-installed in OVS Nov 26 05:05:53 localhost journal[229445]: ethtool ioctl error on tapab7de840-06: No such device Nov 26 05:05:53 localhost ovn_controller[153664]: 2025-11-26T10:05:53Z|00475|binding|INFO|Setting lport ab7de840-06b3-40b5-842d-18d043379e58 up in Southbound Nov 26 05:05:53 localhost nova_compute[281415]: 2025-11-26 10:05:53.282 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:53 localhost nova_compute[281415]: 2025-11-26 10:05:53.284 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:53 localhost journal[229445]: ethtool ioctl error on tapab7de840-06: No such device Nov 26 05:05:53 localhost journal[229445]: ethtool ioctl error on tapab7de840-06: No such device Nov 26 05:05:53 localhost journal[229445]: ethtool ioctl error on tapab7de840-06: No such device Nov 26 05:05:53 localhost journal[229445]: ethtool ioctl error on tapab7de840-06: No such device Nov 26 05:05:53 localhost journal[229445]: ethtool ioctl error on tapab7de840-06: No such device Nov 26 05:05:53 localhost journal[229445]: ethtool ioctl error on tapab7de840-06: No such device Nov 26 05:05:53 localhost nova_compute[281415]: 2025-11-26 10:05:53.334 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:54 localhost nova_compute[281415]: 2025-11-26 10:05:54.048 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:54 localhost ovn_controller[153664]: 2025-11-26T10:05:54Z|00476|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:05:54 localhost nova_compute[281415]: 2025-11-26 10:05:54.111 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:54 localhost podman[326219]: Nov 26 05:05:55 localhost podman[326219]: 2025-11-26 10:05:55.006257095 +0000 UTC m=+0.095350105 container 
create 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:05:55 localhost podman[326219]: 2025-11-26 10:05:54.958852436 +0000 UTC m=+0.047945496 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:05:55 localhost systemd[1]: Started libpod-conmon-619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6.scope. Nov 26 05:05:55 localhost systemd[1]: tmp-crun.5YjMCt.mount: Deactivated successfully. Nov 26 05:05:55 localhost systemd[1]: Started libcrun container. 
Nov 26 05:05:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e192 e192: 6 total, 6 up, 6 in Nov 26 05:05:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/859d46aaf08a3cbedd85dd73e64eb6c33bc87ec9190f4a3aa3edad1e65fd76ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:05:55 localhost podman[326219]: 2025-11-26 10:05:55.119228406 +0000 UTC m=+0.208321416 container init 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 05:05:55 localhost podman[326219]: 2025-11-26 10:05:55.131034431 +0000 UTC m=+0.220127441 container start 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 26 05:05:55 localhost dnsmasq[326237]: started, version 2.85 cachesize 150 Nov 26 05:05:55 localhost dnsmasq[326237]: DNS service limited to local subnets Nov 26 05:05:55 localhost dnsmasq[326237]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC 
loop-detect inotify dumpfile Nov 26 05:05:55 localhost dnsmasq[326237]: warning: no upstream servers configured Nov 26 05:05:55 localhost dnsmasq-dhcp[326237]: DHCP, static leases only on 10.103.0.0, lease time 1d Nov 26 05:05:55 localhost dnsmasq[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/addn_hosts - 0 addresses Nov 26 05:05:55 localhost dnsmasq-dhcp[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/host Nov 26 05:05:55 localhost dnsmasq-dhcp[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/opts Nov 26 05:05:55 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:55.201 262471 INFO neutron.agent.dhcp.agent [None req-2ebcbeca-b46c-4b22-a0d2-edda1a30d5d4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:05:52Z, description=, device_id=d9493b58-53da-4a8c-923b-79b05cde4931, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ab6a01ff-609b-43bc-9667-25b041c5ea9c, ip_allocation=immediate, mac_address=fa:16:3e:55:88:19, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:05:50Z, description=, dns_domain=, id=5d4c3033-661c-42a2-85cc-29f8d2e3b13e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1878225852, port_security_enabled=True, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8341, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2991, status=ACTIVE, subnets=['4bcdaa7d-73b8-435b-b184-84ef132028c0'], tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:51Z, vlan_transparent=None, 
network_id=5d4c3033-661c-42a2-85cc-29f8d2e3b13e, port_security_enabled=False, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3013, status=DOWN, tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:52Z on network 5d4c3033-661c-42a2-85cc-29f8d2e3b13e#033[00m Nov 26 05:05:55 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:55.308 262471 INFO neutron.agent.dhcp.agent [None req-7f8c2c93-56a5-48b2-bfbf-e3e0ccd6a5d8 - - - - - -] DHCP configuration for ports {'fcef3b77-37e0-4d41-8002-1accec928c09'} is completed#033[00m Nov 26 05:05:55 localhost dnsmasq[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/addn_hosts - 1 addresses Nov 26 05:05:55 localhost podman[326255]: 2025-11-26 10:05:55.459417846 +0000 UTC m=+0.068631997 container kill 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:05:55 localhost dnsmasq-dhcp[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/host Nov 26 05:05:55 localhost dnsmasq-dhcp[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/opts Nov 26 05:05:55 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:55.634 262471 INFO neutron.agent.dhcp.agent [None req-1933eb37-259a-47d4-816a-d3c449ba4a8e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, 
binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:05:52Z, description=, device_id=d9493b58-53da-4a8c-923b-79b05cde4931, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ab6a01ff-609b-43bc-9667-25b041c5ea9c, ip_allocation=immediate, mac_address=fa:16:3e:55:88:19, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:05:50Z, description=, dns_domain=, id=5d4c3033-661c-42a2-85cc-29f8d2e3b13e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1878225852, port_security_enabled=True, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8341, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2991, status=ACTIVE, subnets=['4bcdaa7d-73b8-435b-b184-84ef132028c0'], tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:51Z, vlan_transparent=None, network_id=5d4c3033-661c-42a2-85cc-29f8d2e3b13e, port_security_enabled=False, project_id=a2fca84bedb04d0b8a425f8e67ff3ca6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3013, status=DOWN, tags=[], tenant_id=a2fca84bedb04d0b8a425f8e67ff3ca6, updated_at=2025-11-26T10:05:52Z on network 5d4c3033-661c-42a2-85cc-29f8d2e3b13e#033[00m Nov 26 05:05:55 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:55.757 262471 INFO neutron.agent.dhcp.agent [None req-0b963f0a-cf16-4e6b-b805-59b114d8eaa8 - - - - - -] DHCP configuration for ports {'ab6a01ff-609b-43bc-9667-25b041c5ea9c'} is completed#033[00m Nov 26 05:05:55 localhost dnsmasq[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/addn_hosts - 1 addresses Nov 26 05:05:55 localhost dnsmasq-dhcp[326237]: read 
/var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/host Nov 26 05:05:55 localhost dnsmasq-dhcp[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/opts Nov 26 05:05:55 localhost podman[326294]: 2025-11-26 10:05:55.874067093 +0000 UTC m=+0.052553199 container kill 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:05:56 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:56.106 262471 INFO neutron.agent.dhcp.agent [None req-d2b69a02-37e2-4a3c-a676-793360eaf830 - - - - - -] DHCP configuration for ports {'ab6a01ff-609b-43bc-9667-25b041c5ea9c'} is completed#033[00m Nov 26 05:05:56 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e193 e193: 6 total, 6 up, 6 in Nov 26 05:05:56 localhost nova_compute[281415]: 2025-11-26 10:05:56.591 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:57 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e194 e194: 6 total, 6 up, 6 in Nov 26 05:05:57 localhost podman[240049]: time="2025-11-26T10:05:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:05:57 localhost podman[240049]: @ - - [26/Nov/2025:10:05:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 26 05:05:57 localhost podman[240049]: @ - - [26/Nov/2025:10:05:57 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19736 "" "Go-http-client/1.1" Nov 26 05:05:57 localhost nova_compute[281415]: 2025-11-26 10:05:57.866 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:05:58 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:05:58 localhost systemd[1]: tmp-crun.o2Jpjj.mount: Deactivated successfully. Nov 26 05:05:58 localhost dnsmasq[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/addn_hosts - 0 addresses Nov 26 05:05:58 localhost dnsmasq-dhcp[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/host Nov 26 05:05:58 localhost podman[326333]: 2025-11-26 10:05:58.543980507 +0000 UTC m=+0.076711318 container kill 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:05:58 localhost dnsmasq-dhcp[326237]: read /var/lib/neutron/dhcp/5d4c3033-661c-42a2-85cc-29f8d2e3b13e/opts Nov 26 05:05:58 localhost ovn_controller[153664]: 2025-11-26T10:05:58Z|00477|binding|INFO|Releasing lport ab7de840-06b3-40b5-842d-18d043379e58 from this chassis (sb_readonly=0) Nov 26 05:05:58 localhost kernel: 
device tapab7de840-06 left promiscuous mode Nov 26 05:05:58 localhost ovn_controller[153664]: 2025-11-26T10:05:58Z|00478|binding|INFO|Setting lport ab7de840-06b3-40b5-842d-18d043379e58 down in Southbound Nov 26 05:05:58 localhost nova_compute[281415]: 2025-11-26 10:05:58.759 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:58.767 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5d4c3033-661c-42a2-85cc-29f8d2e3b13e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d4c3033-661c-42a2-85cc-29f8d2e3b13e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2fca84bedb04d0b8a425f8e67ff3ca6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5132876d-9dfd-429e-89e4-1f59c26179fc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ab7de840-06b3-40b5-842d-18d043379e58) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:05:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:58.768 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 
ab7de840-06b3-40b5-842d-18d043379e58 in datapath 5d4c3033-661c-42a2-85cc-29f8d2e3b13e unbound from our chassis#033[00m Nov 26 05:05:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:58.769 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d4c3033-661c-42a2-85cc-29f8d2e3b13e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:05:58 localhost ovn_metadata_agent[159481]: 2025-11-26 10:05:58.770 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[6da93223-5584-438b-892b-b26fd5b5a0f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:05:58 localhost nova_compute[281415]: 2025-11-26 10:05:58.779 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:59 localhost nova_compute[281415]: 2025-11-26 10:05:59.114 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e195 e195: 6 total, 6 up, 6 in Nov 26 05:05:59 localhost podman[326375]: 2025-11-26 10:05:59.353601432 +0000 UTC m=+0.065470190 container kill 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:05:59 localhost dnsmasq[326237]: exiting on receipt of SIGTERM Nov 26 
05:05:59 localhost systemd[1]: libpod-619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6.scope: Deactivated successfully. Nov 26 05:05:59 localhost podman[326390]: 2025-11-26 10:05:59.432164276 +0000 UTC m=+0.058712450 container died 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:05:59 localhost systemd[1]: tmp-crun.1P9q19.mount: Deactivated successfully. Nov 26 05:05:59 localhost podman[326390]: 2025-11-26 10:05:59.472791785 +0000 UTC m=+0.099339919 container cleanup 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:05:59 localhost systemd[1]: libpod-conmon-619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6.scope: Deactivated successfully. 
Nov 26 05:05:59 localhost podman[326393]: 2025-11-26 10:05:59.520182303 +0000 UTC m=+0.135483049 container remove 619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d4c3033-661c-42a2-85cc-29f8d2e3b13e, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:05:59 localhost systemd[1]: var-lib-containers-storage-overlay-859d46aaf08a3cbedd85dd73e64eb6c33bc87ec9190f4a3aa3edad1e65fd76ca-merged.mount: Deactivated successfully. Nov 26 05:05:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-619d774fa3e7f05f3176293920c0a80c36e495e0ddc89bb0f1284b39a3de51f6-userdata-shm.mount: Deactivated successfully. Nov 26 05:05:59 localhost systemd[1]: run-netns-qdhcp\x2d5d4c3033\x2d661c\x2d42a2\x2d85cc\x2d29f8d2e3b13e.mount: Deactivated successfully. 
Nov 26 05:05:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:59.575 262471 INFO neutron.agent.dhcp.agent [None req-8f013681-0006-4193-85b0-cb79d56b731f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:05:59.664 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:05:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:05:59 localhost ovn_controller[153664]: 2025-11-26T10:05:59Z|00479|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:05:59 localhost nova_compute[281415]: 2025-11-26 10:05:59.836 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:05:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:05:59 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2100965218' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:05:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:05:59 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2100965218' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:06:00 localhost nova_compute[281415]: 2025-11-26 10:06:00.340 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:02 localhost ovn_controller[153664]: 2025-11-26T10:06:02Z|00480|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:06:02 localhost nova_compute[281415]: 2025-11-26 10:06:02.192 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:02 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e196 e196: 6 total, 6 up, 6 in Nov 26 05:06:02 localhost systemd[1]: tmp-crun.dua37X.mount: Deactivated successfully. Nov 26 05:06:02 localhost dnsmasq[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/addn_hosts - 0 addresses Nov 26 05:06:02 localhost podman[326436]: 2025-11-26 10:06:02.812625545 +0000 UTC m=+0.077805382 container kill 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:06:02 localhost dnsmasq-dhcp[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/host Nov 26 05:06:02 localhost dnsmasq-dhcp[325953]: read /var/lib/neutron/dhcp/0a44989c-90f0-4237-9878-ccfc1abd8dca/opts Nov 26 05:06:02 localhost nova_compute[281415]: 
2025-11-26 10:06:02.867 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:03 localhost ovn_controller[153664]: 2025-11-26T10:06:03Z|00481|binding|INFO|Releasing lport b24a5e0e-a63c-47cf-884e-94a601a6077e from this chassis (sb_readonly=0) Nov 26 05:06:03 localhost kernel: device tapb24a5e0e-a6 left promiscuous mode Nov 26 05:06:03 localhost ovn_controller[153664]: 2025-11-26T10:06:03Z|00482|binding|INFO|Setting lport b24a5e0e-a63c-47cf-884e-94a601a6077e down in Southbound Nov 26 05:06:03 localhost nova_compute[281415]: 2025-11-26 10:06:03.067 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:03 localhost nova_compute[281415]: 2025-11-26 10:06:03.087 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:03.085 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-0a44989c-90f0-4237-9878-ccfc1abd8dca', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a44989c-90f0-4237-9878-ccfc1abd8dca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2fca84bedb04d0b8a425f8e67ff3ca6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': 
'', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9b9f79a-3d96-4c31-bdee-1bad04bcc568, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b24a5e0e-a63c-47cf-884e-94a601a6077e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:06:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:03.088 159486 INFO neutron.agent.ovn.metadata.agent [-] Port b24a5e0e-a63c-47cf-884e-94a601a6077e in datapath 0a44989c-90f0-4237-9878-ccfc1abd8dca unbound from our chassis#033[00m Nov 26 05:06:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:03.090 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a44989c-90f0-4237-9878-ccfc1abd8dca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:06:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:03.092 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[962cd801-7cd0-4626-ae1d-245a32649eca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:06:03 localhost dnsmasq[325953]: exiting on receipt of SIGTERM Nov 26 05:06:03 localhost podman[326535]: 2025-11-26 10:06:03.3908577 +0000 UTC m=+0.054158868 container kill 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 26 05:06:03 localhost systemd[1]: libpod-5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5.scope: Deactivated successfully. Nov 26 05:06:03 localhost podman[326551]: 2025-11-26 10:06:03.46795209 +0000 UTC m=+0.060969871 container died 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 05:06:03 localhost podman[326551]: 2025-11-26 10:06:03.503307324 +0000 UTC m=+0.096324985 container cleanup 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:06:03 localhost systemd[1]: libpod-conmon-5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5.scope: Deactivated successfully. 
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.586 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 05:06:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e197 e197: 6 total, 6 up, 6 in Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.592 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a15bfcaa-4153-4710-820c-978bbd4b2710', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7557, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.587879', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8448ce02-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': 'dfe0b656d026e5de4c71f30d9a53c278f3811555b39be8071f02ccdb1f0865da'}]}, 'timestamp': '2025-11-26 10:06:03.594032', '_unique_id': 'beb2d7216e5c4c47829569782fa29700'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.596 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.598 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 26 05:06:03 localhost podman[326553]: 2025-11-26 10:06:03.629769834 +0000 UTC m=+0.216062046 container remove 5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a44989c-90f0-4237-9878-ccfc1abd8dca, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.647 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.648 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a431668-e50c-4ad8-b631-3a1a658431dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.598918', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '845110a8-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': '9a696c9da625cd75b1a1a338c83798629a5a4fb560abb2ecf817045264bfc04e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.598918', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '84513376-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': 'd540a3f58c2eab4595a927d552263aad58bc3638018f840d85260e957c0e43d2'}]}, 'timestamp': '2025-11-26 10:06:03.648983', '_unique_id': '4bbdb42a4380486497433ca426cfa4ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.651 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.652 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.672 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:03.671 262471 INFO neutron.agent.dhcp.agent [None req-b5987446-61f9-4f1a-ba64-ae2ce9d87ba7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:06:03 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:03.673 262471 INFO neutron.agent.dhcp.agent [None req-b5987446-61f9-4f1a-ba64-ae2ce9d87ba7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72e91eda-3468-4224-8812-d91bbb1d81f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:06:03.653047', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8454e278-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.914717068, 'message_signature': 'b1c58204fab75dc80aef315ac0e35ca8817cd0736f56da6c28f9a4fa1b69dcf0'}]}, 'timestamp': '2025-11-26 10:06:03.673104', '_unique_id': 'd260ec2652844e989c22ea26d9175e73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.674 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:03.674 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 26 05:06:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:03.674 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 26 05:06:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:03.675 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd940cb00-2048-46b9-bdbc-54a6776ec12b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.675550', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '845555be-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': 'e9cc7994dcdc0281766b3df8928cfd64ae764d20e6318bd0257480e760cb75a5'}]}, 'timestamp': '2025-11-26 10:06:03.676088', '_unique_id': '02a6be9aee6c40449f848f427c268acd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63f24ec2-4d41-48af-a237-f1a6dc215392', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.678446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8455c670-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': '54ed53671b3f9a85a60208fd569614fcb156457a60e41db47c223fcf8630b909'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.678446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8455d85e-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': 'c0c683e9042079a7d32b53c9b3497e3faf29c56ef70087a085dd07e09041babe'}]}, 'timestamp': '2025-11-26 10:06:03.679436', '_unique_id': '07c408917f9947329952add011835306'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.680 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.681 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.681 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.682 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '897e245d-a3c4-4f1f-8a42-007aac58d32f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.681741', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '84564906-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': 'b61163cea50b59150bd154ab472e3e6f6ac415e5955cb271d29abafd75f282ed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.681741', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8456633c-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': '73be95e0fb6f17b9880d1fb9a2312c25f871ad930f7decda86272874a5e72943'}]}, 'timestamp': '2025-11-26 10:06:03.682907', '_unique_id': '1cb9be0bf3994a679a1f49028c5897fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.683 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.685 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e94a7a2-6e7b-4bff-b866-c4033fbb4193', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.685230', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8456cf98-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': '9f83311525873bf88c6fcfe496a2d388824c63863c2ace85a2a2f5a1523bd5c8'}]}, 'timestamp': '2025-11-26 10:06:03.685703', '_unique_id': 'd223ec41b6374758a0880162a10f03d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.686 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.688 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c767b424-0ab0-43f1-853a-98851437d3e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.688124', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '845741b2-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': '8bf38c86d4923a7900641aadb51087d194b835469d6cca8230a77e6b41434034'}]}, 'timestamp': '2025-11-26 10:06:03.688637', '_unique_id': 'fd1449000a97447f97dc89de7bfa81e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.689 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.690 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.691 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.702 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.703 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ed6c338-5016-44e6-9b62-1231b443b39b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.691104', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '84597f5e-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.933332356, 'message_signature': '470750eecdd1c198dc056d01a1aa6d3d49f0053166ccdc3792cfbd48b8db9f5d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.691104', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '845989f4-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.933332356, 'message_signature': 'f9b1ac56dbe5c5899b0e9e177905090e7f1c6898ead804ed46dceffa9c5fa280'}]}, 'timestamp': '2025-11-26 10:06:03.703473', '_unique_id': '214ab45b64af493482e9485125dc54b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:06:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.704 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '544a1de3-1757-45f2-953c-85bf98b47656', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.704846', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '8459cb26-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': 'f92f7c13a8135a084de78629c88f5222123cef9c41ff97ce2cc2886f9510ec9f'}]}, 'timestamp': '2025-11-26 10:06:03.705217', '_unique_id': 'b4a4ec2feab646bca4015e0583bec1f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.706 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.706 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.706 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '728de330-0468-4fec-9ffa-8ad59b14385d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.706590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '845a0e1a-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.933332356, 'message_signature': '10501068763ab0ef03a3741d79bdf4843268cabdc063a92c810332fb26421fd8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.706590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '845a1928-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.933332356, 'message_signature': '8bad831f9d91337b42e3daf2f36bc35a6bd69aecfd164d0954038e98a9d6e506'}]}, 'timestamp': '2025-11-26 10:06:03.707142', '_unique_id': '2e310f2dbb3a4281a080414dd7afbebf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.708 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.708 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7681bada-6896-4ac7-873d-688e2e990de7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.708495', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '845a58d4-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': '188b244ed84cc97346e923f3e74093a33033f48d25afeff9b33f45df711957b6'}]}, 'timestamp': '2025-11-26 10:06:03.708788', '_unique_id': '397f1928fcb44f7b8b8cfce0f04c7cc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.710 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.710 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'db8d5fbe-daf6-4718-9486-4630c52793b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.710160', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '845a99c0-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': '9444cc892852af18f9b35e5265d0755aa9d0ac7b0b0173fc4e5cc106d5e04154'}]}, 'timestamp': '2025-11-26 10:06:03.710451', '_unique_id': '8a49f3dada1c4acdb405c595f1801c06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.711 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c4c3422d-1937-4f3f-b020-69e3e00bd9d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.711847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '845adc5a-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': '602709332894e8df5b58863f3e46879ab5b99f1b59a9b2876357f8850d784023'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.711847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '845ae682-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': 'e142a61d4b6d162c09ddb3116ae69eb553116535d57cdf2bc992b9469eb45825'}]}, 'timestamp': '2025-11-26 10:06:03.712400', '_unique_id': 'd504d4b24dbf4d6b9d0fd051c9ce155f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.712 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.713 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '684f286d-9f53-4da8-8789-c352c0b665b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.713833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '845b29a8-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': 'b7b7f6c07fd12c01fff413833bceb9a01284c0cec4de2392923cd64d6e01d680'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.713833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '845b34b6-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': 'e98a6d9a6452db7ffb113610310f7f1c652189d584cd7cf952954d0892ea05c1'}]}, 'timestamp': '2025-11-26 10:06:03.714398', '_unique_id': 'e18f6222a4534b9aa8b54697857c0c33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.714 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.715 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.715 12 DEBUG ceilometer.compute.pollsters [-] 
9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38e9ceb8-5130-446e-a31c-d28e143e1e58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.715729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '845b7336-caaf-11f0-9fe3-fa163e73ba36', 
'monotonic_time': 12038.933332356, 'message_signature': 'f639eacb613482b71404d6a89662eb3b628dab01092d1a06692904cfbc339a03'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.715729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '845b7e76-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.933332356, 'message_signature': '7a01646f327421a8eac8403123ed8300f03474f041e4fc7aa521aa1cb8f7c677'}]}, 'timestamp': '2025-11-26 10:06:03.716287', '_unique_id': '9af549d1b610472a832cbf3ccef319d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 
26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.716 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.717 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee80583d-644f-41b5-99dc-6130de63d877', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.717763', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '845bc2b4-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': '795141ecaa4023f63b5bc80247caac6cc00e092138c28af4386539dd730a386c'}]}, 'timestamp': '2025-11-26 10:06:03.718088', '_unique_id': 'b7f2d2805cf34590bd021ee9e220b827'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.718 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.719 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.719 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 17900000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcc49022-f881-49b6-b370-88ff28cffb0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17900000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:06:03.719389', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '845c0206-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.914717068, 'message_signature': '0b9fbb34c591fc141e46378e525e190c4fe3c582c938dfe9a4cc68f2354f5f35'}]}, 'timestamp': '2025-11-26 10:06:03.719665', '_unique_id': '5c2ec516d89846b4ac465c2cd5e6d704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.720 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.721 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.721 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 68 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2923c6f8-9da8-4ce5-a7ca-6435038d0ba8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 68, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.721265', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '845c4b62-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': '5cf6ad462b6a33c95f53b8b7f14a532ef3257d77818c0321b825a351c0a0ad60'}]}, 'timestamp': '2025-11-26 10:06:03.721551', '_unique_id': '3c8b07c84f0b49cba2d752fe69bdfb9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.722 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5eea0383-13e5-42eb-a0ab-2cca18945022', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:06:03.722872', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '845c8b18-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.830158649, 'message_signature': '4be620e4ad57738838298ce9b7f60f24ddc9d3e2c949eeef43f9831e37ea67bf'}]}, 'timestamp': '2025-11-26 10:06:03.723181', '_unique_id': '28173e14f70640ec9ebe9da76c6cf4c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.723 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.724 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.724 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.724 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d15d2b0-64fd-4bc8-87f8-a934fc37ebee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:06:03.724639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '845ccf24-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': '908487dca2ebefa08885c2103b1bb2f505d4d25950885271d5474ca7b68a79d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:06:03.724639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '845cdadc-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12038.841194611, 'message_signature': 'c7c5cdef9b56adf9989694d7faf3f411ca9c76f91e8202ed59da507139f82f10'}]}, 'timestamp': '2025-11-26 10:06:03.725207', '_unique_id': 'd63c4d773fe94a52a182394ce67b79bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:06:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:06:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:06:03.725 12 ERROR oslo_messaging.notify.messaging Nov 26 05:06:03 localhost ovn_controller[153664]: 2025-11-26T10:06:03Z|00483|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:06:03 localhost nova_compute[281415]: 2025-11-26 10:06:03.800 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:03 localhost systemd[1]: var-lib-containers-storage-overlay-785ba14175673e2945c94fa663dcc64f5675db0d711bc0e0bb93e1573f581e58-merged.mount: Deactivated successfully. Nov 26 05:06:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a2c41cd507d1a03391fe3fcb71d97c28cf4c2f5aa4ff2edff12a9f7f544cda5-userdata-shm.mount: Deactivated successfully. Nov 26 05:06:03 localhost systemd[1]: run-netns-qdhcp\x2d0a44989c\x2d90f0\x2d4237\x2d9878\x2dccfc1abd8dca.mount: Deactivated successfully. 
Nov 26 05:06:03 localhost podman[326620]: 2025-11-26 10:06:03.938413686 +0000 UTC m=+0.101504565 container exec a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, distribution-scope=public, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 26 05:06:04 localhost podman[326620]: 2025-11-26 10:06:04.094778461 +0000 UTC m=+0.257869310 container exec_died a5033d31d58238ccc3854adb1d5653cd506ae6bf5cb65e463a13411a23a04d3c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-0d5e5e6d-3c4b-5efe-8c65-346ae6715606-crash-np0005536118, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git) Nov 26 05:06:04 localhost nova_compute[281415]: 2025-11-26 10:06:04.118 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e198 e198: 6 total, 6 up, 6 in Nov 26 05:06:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:06:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:06:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:06:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:06:04 localhost sshd[326739]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:06:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:04 localhost nova_compute[281415]: 2025-11-26 10:06:04.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:04 localhost nova_compute[281415]: 2025-11-26 
10:06:04.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:05 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:05 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 05:06:05 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 14K writes, 58K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s#012Cumulative WAL: 14K writes, 4589 syncs, 3.19 writes per sync, written: 0.04 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8734 writes, 32K keys, 8734 commit groups, 1.0 writes per commit group, ingest: 24.90 MB, 0.04 MB/s#012Interval WAL: 8734 writes, 3722 syncs, 2.35 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch 
Nov 26 05:06:05 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536117.localdomain to 836.6M Nov 26 05:06:05 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536117.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536119.localdomain to 836.6M Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536119.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 
172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 26 05:06:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:06:05 localhost nova_compute[281415]: 2025-11-26 10:06:05.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:05 localhost nova_compute[281415]: 2025-11-26 10:06:05.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:05 localhost nova_compute[281415]: 2025-11-26 10:06:05.875 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:06:05 localhost nova_compute[281415]: 2025-11-26 10:06:05.876 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:06:05 localhost nova_compute[281415]: 2025-11-26 10:06:05.876 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:06:05 localhost nova_compute[281415]: 2025-11-26 10:06:05.876 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:06:05 localhost nova_compute[281415]: 2025-11-26 10:06:05.877 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:06:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:06:06 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/461668077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.412 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.509 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.510 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.748 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.749 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11214MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.750 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.750 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.811 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.811 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.812 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:06:06 localhost nova_compute[281415]: 2025-11-26 10:06:06.852 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:06:06 localhost ceph-mon[297296]: Adjusting osd_memory_target on np0005536118.localdomain to 836.6M Nov 26 05:06:06 localhost ceph-mon[297296]: Unable to set osd_memory_target on np0005536118.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0. 
Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:06.894794) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46 Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151566894876, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 869, "num_deletes": 254, "total_data_size": 1042850, "memory_usage": 1058848, "flush_reason": "Manual Compaction"} Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started Nov 26 05:06:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e199 e199: 6 total, 6 up, 6 in Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151566901971, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 683268, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27745, "largest_seqno": 28609, "table_properties": {"data_size": 679011, "index_size": 1920, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10987, "raw_average_key_size": 21, "raw_value_size": 670015, "raw_average_value_size": 1334, "num_data_blocks": 77, "num_entries": 502, "num_filter_entries": 502, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", 
"compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151538, "oldest_key_time": 1764151538, "file_creation_time": 1764151566, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}} Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 7214 microseconds, and 3407 cpu microseconds. Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:06.902019) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 683268 bytes OK Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:06.902047) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:06.903953) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:06.903975) EVENT_LOG_v1 {"time_micros": 1764151566903966, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:06.903997) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: 
[db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1038133, prev total WAL file size 1038174, number of live WAL files 2. Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:06.904668) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end) Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(667KB)], [45(17MB)] Nov 26 05:06:06 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151566904725, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 18811187, "oldest_snapshot_seqno": -1} Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12779 keys, 17588722 bytes, temperature: kUnknown Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151567000163, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 17588722, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17513290, "index_size": 42371, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 
1, "filter_size": 32005, "raw_key_size": 342552, "raw_average_key_size": 26, "raw_value_size": 17293101, "raw_average_value_size": 1353, "num_data_blocks": 1602, "num_entries": 12779, "num_filter_entries": 12779, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151566, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}} Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:07.000490) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 17588722 bytes Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:07.001915) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 196.9 rd, 184.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 17.3 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(53.3) write-amplify(25.7) OK, records in: 13311, records dropped: 532 output_compression: NoCompression Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:07.001958) EVENT_LOG_v1 {"time_micros": 1764151567001924, "job": 26, "event": "compaction_finished", "compaction_time_micros": 95527, "compaction_time_cpu_micros": 52142, "output_level": 6, "num_output_files": 1, "total_output_size": 17588722, "num_input_records": 13311, "num_output_records": 12779, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151567002205, "job": 26, "event": "table_file_deletion", "file_number": 47} Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151567004171, 
"job": 26, "event": "table_file_deletion", "file_number": 45} Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:06.904588) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:07.004284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:07.004293) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:07.004295) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:07.004298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:06:07 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:06:07.004300) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:06:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:06:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1485475966' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:06:07 localhost nova_compute[281415]: 2025-11-26 10:06:07.354 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:06:07 localhost nova_compute[281415]: 2025-11-26 10:06:07.363 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:06:07 localhost nova_compute[281415]: 2025-11-26 10:06:07.386 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:06:07 localhost nova_compute[281415]: 2025-11-26 10:06:07.388 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:06:07 localhost nova_compute[281415]: 2025-11-26 10:06:07.389 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:06:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:06:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:06:07 localhost podman[326871]: 2025-11-26 10:06:07.841110706 +0000 UTC m=+0.090406682 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:06:07 localhost podman[326871]: 2025-11-26 10:06:07.853715966 +0000 UTC m=+0.103011982 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, 
maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 05:06:07 localhost nova_compute[281415]: 2025-11-26 10:06:07.871 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:07 localhost podman[326872]: 2025-11-26 10:06:07.893500669 +0000 UTC m=+0.141073552 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:06:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e200 e200: 6 total, 6 up, 6 in Nov 26 05:06:07 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 05:06:07 localhost podman[326872]: 2025-11-26 10:06:07.930868807 +0000 UTC m=+0.178441700 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm) Nov 26 05:06:07 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 05:06:08 localhost nova_compute[281415]: 2025-11-26 10:06:08.385 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:08 localhost nova_compute[281415]: 2025-11-26 10:06:08.407 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:08 localhost nova_compute[281415]: 2025-11-26 10:06:08.408 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:08 localhost nova_compute[281415]: 2025-11-26 10:06:08.408 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:08 localhost nova_compute[281415]: 2025-11-26 10:06:08.409 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:06:08 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e201 e201: 6 total, 6 up, 6 in Nov 26 05:06:08 localhost nova_compute[281415]: 2025-11-26 10:06:08.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:08 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:06:09 localhost nova_compute[281415]: 2025-11-26 10:06:09.155 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:09 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 05:06:09 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 13K writes, 52K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s#012Cumulative WAL: 13K writes, 4567 syncs, 3.06 writes per sync, written: 0.04 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8995 writes, 30K keys, 8995 commit groups, 1.0 writes per commit group, ingest: 18.88 MB, 0.03 MB/s#012Interval WAL: 8995 writes, 3889 syncs, 2.31 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 26 05:06:09 localhost 
ceph-mon[297296]: mon.np0005536118@1(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e202 e202: 6 total, 6 up, 6 in Nov 26 05:06:10 localhost nova_compute[281415]: 2025-11-26 10:06:10.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:06:10 localhost nova_compute[281415]: 2025-11-26 10:06:10.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:06:10 localhost nova_compute[281415]: 2025-11-26 10:06:10.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:06:10 localhost nova_compute[281415]: 2025-11-26 10:06:10.975 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:06:10 localhost nova_compute[281415]: 2025-11-26 10:06:10.975 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:06:10 localhost nova_compute[281415]: 2025-11-26 10:06:10.976 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] 
[instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:06:10 localhost nova_compute[281415]: 2025-11-26 10:06:10.976 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:06:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e203 e203: 6 total, 6 up, 6 in Nov 26 05:06:11 localhost nova_compute[281415]: 2025-11-26 10:06:11.135 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:11 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:11.135 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:06:11 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:11.137 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:06:11 localhost nova_compute[281415]: 2025-11-26 10:06:11.389 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 
9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:06:11 localhost nova_compute[281415]: 2025-11-26 10:06:11.407 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:06:11 localhost nova_compute[281415]: 2025-11-26 10:06:11.408 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:06:11 localhost 
ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:11 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:12.067 2 INFO neutron.agent.securitygroups_rpc [None req-5853f316-82fc-435a-9c12-61ca72009667 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['14ca9467-db0c-4b5e-a29b-5f8bd80e4937']#033[00m Nov 26 05:06:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:12.164 2 INFO neutron.agent.securitygroups_rpc [None req-fa8983e9-a24a-4a60-99fa-588380881572 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['14ca9467-db0c-4b5e-a29b-5f8bd80e4937']#033[00m Nov 26 05:06:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:12.622 2 INFO neutron.agent.securitygroups_rpc [None req-b2b3480d-8e3c-4855-9ea9-e44c001cd92d e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:06:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:06:12 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:12.827 2 INFO neutron.agent.securitygroups_rpc [None req-1bed7176-3974-43b1-b8d6-0fef30fd9fdc e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:12 localhost systemd[1]: tmp-crun.hnJ3bn.mount: Deactivated successfully. Nov 26 05:06:12 localhost podman[326915]: 2025-11-26 10:06:12.855270483 +0000 UTC m=+0.110003359 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6) Nov 26 05:06:12 localhost podman[326915]: 2025-11-26 10:06:12.874173349 +0000 UTC m=+0.128906185 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_id=edpm, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 26 05:06:12 localhost nova_compute[281415]: 2025-11-26 10:06:12.874 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:12 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:06:12 localhost podman[326914]: 2025-11-26 10:06:12.94653085 +0000 UTC m=+0.201208825 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true) Nov 26 05:06:13 localhost podman[326914]: 2025-11-26 10:06:13.017480589 +0000 UTC m=+0.272158564 container exec_died 
123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:06:13 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:13 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/9fa9d05c-ff76-44e1-af53-b1c099c1a475/ccc7563c-ea0d-492f-be71-fb20bbbca86e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9fa9d05c-ff76-44e1-af53-b1c099c1a475", "mon", "allow 
r"], "format": "json"} : dispatch Nov 26 05:06:13 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/9fa9d05c-ff76-44e1-af53-b1c099c1a475/ccc7563c-ea0d-492f-be71-fb20bbbca86e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9fa9d05c-ff76-44e1-af53-b1c099c1a475", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:13 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/9fa9d05c-ff76-44e1-af53-b1c099c1a475/ccc7563c-ea0d-492f-be71-fb20bbbca86e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9fa9d05c-ff76-44e1-af53-b1c099c1a475", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:06:13 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:06:13 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:13.083 2 INFO neutron.agent.securitygroups_rpc [None req-41ec0012-8cba-4b46-b456-43cbc5872e21 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:13 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:13.252 2 INFO neutron.agent.securitygroups_rpc [None req-b68ca1cd-9fa3-4b8c-81aa-19e7da2ac29e e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:13 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:13.429 2 INFO neutron.agent.securitygroups_rpc [None req-af314f47-4287-4af4-8555-944a5b66cee1 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:13 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:13.589 2 INFO neutron.agent.securitygroups_rpc [None req-3815965d-da3f-4deb-9a11-85d83284b90f e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:13 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e204 e204: 6 total, 6 up, 6 in Nov 26 05:06:13 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:13.772 2 INFO neutron.agent.securitygroups_rpc [None req-aef8f752-f417-49f4-8186-cf79fe153771 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:14 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:14.014 2 INFO neutron.agent.securitygroups_rpc [None req-bf67774d-0962-4f5f-9e60-e4cdfa6a3a90 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default 
default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:14 localhost nova_compute[281415]: 2025-11-26 10:06:14.187 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:14 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:14.194 2 INFO neutron.agent.securitygroups_rpc [None req-3715301a-9c61-49e5-8a1d-6234190a5bda e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:14 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:14.492 2 INFO neutron.agent.securitygroups_rpc [None req-3ee227db-0e7d-4801-8968-e1c54916f9df e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['c4405d3a-5c5f-4dd2-94ba-0b177d9becdf']#033[00m Nov 26 05:06:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:14 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:14.822 262471 INFO neutron.agent.linux.ip_lib [None req-d5505e28-c7c2-418a-8ee3-83f752351a23 - - - - - -] Device tapb2e3b1e1-0a cannot be used as it has no MAC address#033[00m Nov 26 05:06:14 localhost nova_compute[281415]: 2025-11-26 10:06:14.851 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:14 localhost kernel: device tapb2e3b1e1-0a entered promiscuous mode Nov 26 05:06:14 localhost nova_compute[281415]: 2025-11-26 10:06:14.860 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:14 localhost NetworkManager[5970]: [1764151574.8609] manager: (tapb2e3b1e1-0a): new Generic 
device (/org/freedesktop/NetworkManager/Devices/75) Nov 26 05:06:14 localhost ovn_controller[153664]: 2025-11-26T10:06:14Z|00484|binding|INFO|Claiming lport b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a for this chassis. Nov 26 05:06:14 localhost ovn_controller[153664]: 2025-11-26T10:06:14Z|00485|binding|INFO|b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a: Claiming unknown Nov 26 05:06:14 localhost systemd-udevd[326971]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:06:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:14.873 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-c6c96c4d-a15e-43c2-92db-590d17ac6c96', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6c96c4d-a15e-43c2-92db-590d17ac6c96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25af506c54f34bb4baedfbfac942ffa0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82bcb304-c12b-4e0e-8d67-c4a3965eb271, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:06:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:14.875 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a in datapath c6c96c4d-a15e-43c2-92db-590d17ac6c96 bound to our chassis#033[00m Nov 26 05:06:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:14.876 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c6c96c4d-a15e-43c2-92db-590d17ac6c96 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:06:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:14.877 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae7af4a-b17a-4124-b688-4d7657d31418]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:06:14 localhost journal[229445]: ethtool ioctl error on tapb2e3b1e1-0a: No such device Nov 26 05:06:14 localhost journal[229445]: ethtool ioctl error on tapb2e3b1e1-0a: No such device Nov 26 05:06:14 localhost ovn_controller[153664]: 2025-11-26T10:06:14Z|00486|binding|INFO|Setting lport b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a ovn-installed in OVS Nov 26 05:06:14 localhost ovn_controller[153664]: 2025-11-26T10:06:14Z|00487|binding|INFO|Setting lport b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a up in Southbound Nov 26 05:06:14 localhost nova_compute[281415]: 2025-11-26 10:06:14.904 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:14 localhost journal[229445]: ethtool ioctl error on tapb2e3b1e1-0a: No such device Nov 26 05:06:14 localhost journal[229445]: ethtool ioctl error on tapb2e3b1e1-0a: No such device Nov 26 05:06:14 localhost journal[229445]: ethtool ioctl error on tapb2e3b1e1-0a: No such device Nov 26 05:06:14 localhost journal[229445]: ethtool ioctl error on tapb2e3b1e1-0a: No such device Nov 26 05:06:14 localhost journal[229445]: ethtool ioctl error on tapb2e3b1e1-0a: 
No such device Nov 26 05:06:14 localhost journal[229445]: ethtool ioctl error on tapb2e3b1e1-0a: No such device Nov 26 05:06:14 localhost nova_compute[281415]: 2025-11-26 10:06:14.952 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:14 localhost nova_compute[281415]: 2025-11-26 10:06:14.981 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:15 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:15.139 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:06:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:15.270 2 INFO neutron.agent.securitygroups_rpc [None req-bfc55d66-fa2e-4bcd-aaa9-6a6d9aa0a5ec e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['58f5a126-9aad-4387-9c0e-d334916deddc']#033[00m Nov 26 05:06:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e205 e205: 6 total, 6 up, 6 in Nov 26 05:06:15 localhost openstack_network_exporter[242153]: ERROR 10:06:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:06:15 localhost openstack_network_exporter[242153]: ERROR 10:06:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:06:15 localhost openstack_network_exporter[242153]: Nov 26 05:06:15 localhost openstack_network_exporter[242153]: ERROR 10:06:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 
05:06:15 localhost openstack_network_exporter[242153]: ERROR 10:06:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:06:15 localhost openstack_network_exporter[242153]: ERROR 10:06:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:06:15 localhost openstack_network_exporter[242153]: Nov 26 05:06:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:15 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:15 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:15.919 2 INFO neutron.agent.securitygroups_rpc [None req-155dddc6-e29e-4301-b702-1d3e12b811c8 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['485fc1ac-0d0c-4a73-b793-c518c13e7818']#033[00m Nov 26 05:06:15 localhost podman[327043]: Nov 26 05:06:15 localhost podman[327043]: 2025-11-26 10:06:15.99459367 +0000 UTC m=+0.116769198 container create 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 26 05:06:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 05:06:16 localhost systemd[1]: Started libpod-conmon-03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b.scope. Nov 26 05:06:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:16.032 2 INFO neutron.agent.securitygroups_rpc [None req-f78c387f-8813-4e23-a91f-59aec513e5b4 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['485fc1ac-0d0c-4a73-b793-c518c13e7818']#033[00m Nov 26 05:06:16 localhost podman[327043]: 2025-11-26 10:06:15.940975009 +0000 UTC m=+0.063150597 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:06:16 localhost systemd[1]: Started libcrun container. Nov 26 05:06:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/861db8e9e3e2368f86c3d0d56797fa1378a6c70c22343a6763cd2a59cbaaaddd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:06:16 localhost podman[327043]: 2025-11-26 10:06:16.066280281 +0000 UTC m=+0.188455799 container init 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 26 05:06:16 localhost podman[327043]: 2025-11-26 10:06:16.076467147 +0000 UTC m=+0.198642665 container start 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:06:16 localhost dnsmasq[327071]: started, version 2.85 cachesize 150 Nov 26 05:06:16 localhost dnsmasq[327071]: DNS service limited to local subnets Nov 26 05:06:16 localhost dnsmasq[327071]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:06:16 localhost dnsmasq[327071]: warning: no upstream servers configured Nov 26 05:06:16 localhost dnsmasq-dhcp[327071]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:06:16 localhost dnsmasq[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/addn_hosts - 0 addresses Nov 26 05:06:16 localhost dnsmasq-dhcp[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/host Nov 26 05:06:16 localhost dnsmasq-dhcp[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/opts Nov 26 05:06:16 localhost podman[327057]: 2025-11-26 10:06:16.151213833 +0000 UTC m=+0.107553533 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:06:16 localhost podman[327057]: 2025-11-26 10:06:16.161588434 +0000 UTC m=+0.117927464 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:06:16 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 05:06:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:16.264 262471 INFO neutron.agent.dhcp.agent [None req-882ee22b-1b54-4055-9b04-cf01a11a4dc1 - - - - - -] DHCP configuration for ports {'f1de8deb-183e-48a9-a88b-82c0e379a21a'} is completed#033[00m Nov 26 05:06:16 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:16 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:16 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:16 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"}]': finished Nov 26 05:06:16 localhost nova_compute[281415]: 2025-11-26 10:06:16.689 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:16 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:16.895 2 INFO neutron.agent.securitygroups_rpc [None req-d8e37378-f908-4781-bfd7-ee09fe58d27d e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security 
group rule updated ['efab2688-3b83-45d0-8598-88d9cc4a5fab']#033[00m Nov 26 05:06:17 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:17.079 2 INFO neutron.agent.securitygroups_rpc [None req-a49a06b4-6f0f-46c8-96ee-60dff93e999f e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['efab2688-3b83-45d0-8598-88d9cc4a5fab']#033[00m Nov 26 05:06:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:17.312 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:06:17Z, description=, device_id=847a021b-6135-4bcc-82c3-a845652255f1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d443cdee-4ec5-4fc1-a6dd-e72e893fde92, ip_allocation=immediate, mac_address=fa:16:3e:9e:15:7f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:06:13Z, description=, dns_domain=, id=c6c96c4d-a15e-43c2-92db-590d17ac6c96, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1643873068-network, port_security_enabled=True, project_id=25af506c54f34bb4baedfbfac942ffa0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5007, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3137, status=ACTIVE, subnets=['e9bc8532-db9f-44b0-83b5-0dadb6108fcc'], tags=[], tenant_id=25af506c54f34bb4baedfbfac942ffa0, updated_at=2025-11-26T10:06:13Z, vlan_transparent=None, network_id=c6c96c4d-a15e-43c2-92db-590d17ac6c96, port_security_enabled=False, project_id=25af506c54f34bb4baedfbfac942ffa0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, 
security_groups=[], standard_attr_id=3173, status=DOWN, tags=[], tenant_id=25af506c54f34bb4baedfbfac942ffa0, updated_at=2025-11-26T10:06:17Z on network c6c96c4d-a15e-43c2-92db-590d17ac6c96#033[00m Nov 26 05:06:17 localhost podman[327105]: 2025-11-26 10:06:17.524294765 +0000 UTC m=+0.052709593 container kill 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:06:17 localhost dnsmasq[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/addn_hosts - 1 addresses Nov 26 05:06:17 localhost dnsmasq-dhcp[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/host Nov 26 05:06:17 localhost dnsmasq-dhcp[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/opts Nov 26 05:06:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e206 e206: 6 total, 6 up, 6 in Nov 26 05:06:17 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:17.807 2 INFO neutron.agent.securitygroups_rpc [None req-c6783789-84f8-403a-9850-7306cb5075fc e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['4ff74c14-4a48-4905-9f6d-d6058270dbbf']#033[00m Nov 26 05:06:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:17.839 262471 INFO neutron.agent.dhcp.agent [None req-07d47bf3-007d-47e6-91b7-cd4756c0ce52 - - - - - -] DHCP configuration for ports {'d443cdee-4ec5-4fc1-a6dd-e72e893fde92'} is completed#033[00m Nov 26 05:06:17 localhost nova_compute[281415]: 2025-11-26 10:06:17.920 
281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:18 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:18.495 2 INFO neutron.agent.securitygroups_rpc [None req-bd7fdfcd-1b4e-4135-a74a-a339cdc4e2f1 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['4ff74c14-4a48-4905-9f6d-d6058270dbbf']#033[00m Nov 26 05:06:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:18.696 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:06:17Z, description=, device_id=847a021b-6135-4bcc-82c3-a845652255f1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d443cdee-4ec5-4fc1-a6dd-e72e893fde92, ip_allocation=immediate, mac_address=fa:16:3e:9e:15:7f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:06:13Z, description=, dns_domain=, id=c6c96c4d-a15e-43c2-92db-590d17ac6c96, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1643873068-network, port_security_enabled=True, project_id=25af506c54f34bb4baedfbfac942ffa0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5007, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3137, status=ACTIVE, subnets=['e9bc8532-db9f-44b0-83b5-0dadb6108fcc'], tags=[], tenant_id=25af506c54f34bb4baedfbfac942ffa0, updated_at=2025-11-26T10:06:13Z, vlan_transparent=None, network_id=c6c96c4d-a15e-43c2-92db-590d17ac6c96, port_security_enabled=False, project_id=25af506c54f34bb4baedfbfac942ffa0, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3173, status=DOWN, tags=[], tenant_id=25af506c54f34bb4baedfbfac942ffa0, updated_at=2025-11-26T10:06:17Z on network c6c96c4d-a15e-43c2-92db-590d17ac6c96#033[00m Nov 26 05:06:18 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:18.810 2 INFO neutron.agent.securitygroups_rpc [None req-9b6a89db-ef6d-46bc-8169-370fdc2f0dda e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['4ff74c14-4a48-4905-9f6d-d6058270dbbf']#033[00m Nov 26 05:06:18 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:18 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:18.945 262471 INFO neutron.agent.linux.ip_lib [None req-8a17fb27-1510-4d3a-9c1e-78c31b5d4c39 - - - - - -] Device tap0c42f84a-24 cannot be used as it has no MAC address#033[00m Nov 26 05:06:18 localhost systemd[1]: tmp-crun.rcCb8v.mount: Deactivated successfully. 
Nov 26 05:06:19 localhost dnsmasq[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/addn_hosts - 1 addresses Nov 26 05:06:19 localhost dnsmasq-dhcp[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/host Nov 26 05:06:19 localhost dnsmasq-dhcp[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/opts Nov 26 05:06:19 localhost nova_compute[281415]: 2025-11-26 10:06:19.002 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:19 localhost podman[327149]: 2025-11-26 10:06:19.003120585 +0000 UTC m=+0.095684796 container kill 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:06:19 localhost kernel: device tap0c42f84a-24 entered promiscuous mode Nov 26 05:06:19 localhost NetworkManager[5970]: [1764151579.0116] manager: (tap0c42f84a-24): new Generic device (/org/freedesktop/NetworkManager/Devices/76) Nov 26 05:06:19 localhost ovn_controller[153664]: 2025-11-26T10:06:19Z|00488|binding|INFO|Claiming lport 0c42f84a-2482-4da6-b9da-00ce975f9ad2 for this chassis. 
Nov 26 05:06:19 localhost ovn_controller[153664]: 2025-11-26T10:06:19Z|00489|binding|INFO|0c42f84a-2482-4da6-b9da-00ce975f9ad2: Claiming unknown Nov 26 05:06:19 localhost nova_compute[281415]: 2025-11-26 10:06:19.012 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:19 localhost systemd-udevd[327167]: Network interface NamePolicy= disabled on kernel command line. Nov 26 05:06:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:19.023 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4890ed8f-d73a-4106-bf53-9d2cfe52e760', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4890ed8f-d73a-4106-bf53-9d2cfe52e760', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b23390764b654d8db9370e455401b357', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69014c16-e0ff-4371-83cb-2c858ed8bf9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0c42f84a-2482-4da6-b9da-00ce975f9ad2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:06:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:19.025 159486 INFO 
neutron.agent.ovn.metadata.agent [-] Port 0c42f84a-2482-4da6-b9da-00ce975f9ad2 in datapath 4890ed8f-d73a-4106-bf53-9d2cfe52e760 bound to our chassis#033[00m Nov 26 05:06:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:19.029 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port a4c73ff5-8761-41eb-963d-91c45cf43917 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:06:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:19.030 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4890ed8f-d73a-4106-bf53-9d2cfe52e760, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:06:19 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:19.031 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[1d15ef90-13e3-4dca-8701-d7282ead393f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:06:19 localhost nova_compute[281415]: 2025-11-26 10:06:19.037 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:19 localhost ovn_controller[153664]: 2025-11-26T10:06:19Z|00490|binding|INFO|Setting lport 0c42f84a-2482-4da6-b9da-00ce975f9ad2 ovn-installed in OVS Nov 26 05:06:19 localhost ovn_controller[153664]: 2025-11-26T10:06:19Z|00491|binding|INFO|Setting lport 0c42f84a-2482-4da6-b9da-00ce975f9ad2 up in Southbound Nov 26 05:06:19 localhost nova_compute[281415]: 2025-11-26 10:06:19.057 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:19 localhost nova_compute[281415]: 2025-11-26 10:06:19.108 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:19 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:19 localhost nova_compute[281415]: 2025-11-26 10:06:19.146 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:19 localhost nova_compute[281415]: 2025-11-26 10:06:19.189 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:19 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:19.291 262471 INFO neutron.agent.dhcp.agent [None req-4499bac9-dbb4-4a93-bfcf-17504bf49716 - - - - - -] DHCP configuration for ports {'d443cdee-4ec5-4fc1-a6dd-e72e893fde92'} is completed#033[00m Nov 26 05:06:19 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:19.348 2 INFO neutron.agent.securitygroups_rpc [None req-f937971b-c5b8-4bd0-9578-c629772c871a e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['4ff74c14-4a48-4905-9f6d-d6058270dbbf']#033[00m Nov 26 05:06:19 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:19.604 2 INFO neutron.agent.securitygroups_rpc [None req-c0e38151-ef3e-4e07-9ccb-a5b0dfbf9fe0 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['4ff74c14-4a48-4905-9f6d-d6058270dbbf']#033[00m Nov 26 05:06:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e207 e207: 6 total, 6 up, 6 in Nov 26 05:06:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 
343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:19 localhost nova_compute[281415]: 2025-11-26 10:06:19.831 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:20 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:20.087 2 INFO neutron.agent.securitygroups_rpc [None req-af72e6a0-bce2-48ff-b07e-d7bb92e4ce57 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['4ff74c14-4a48-4905-9f6d-d6058270dbbf']#033[00m Nov 26 05:06:20 localhost podman[327230]: Nov 26 05:06:20 localhost podman[327230]: 2025-11-26 10:06:20.106698628 +0000 UTC m=+0.095440158 container create 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 26 05:06:20 localhost systemd[1]: Started libpod-conmon-21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8.scope. Nov 26 05:06:20 localhost podman[327230]: 2025-11-26 10:06:20.055973666 +0000 UTC m=+0.044715216 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:06:20 localhost systemd[1]: Started libcrun container. 
Nov 26 05:06:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8389bcfeccf59545a7a2d2123ae64fd08137805c40bcf28384e28a00cd22716e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:06:20 localhost podman[327230]: 2025-11-26 10:06:20.181284329 +0000 UTC m=+0.170025839 container init 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:06:20 localhost podman[327230]: 2025-11-26 10:06:20.193901239 +0000 UTC m=+0.182642809 container start 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 26 05:06:20 localhost dnsmasq[327248]: started, version 2.85 cachesize 150 Nov 26 05:06:20 localhost dnsmasq[327248]: DNS service limited to local subnets Nov 26 05:06:20 localhost dnsmasq[327248]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:06:20 localhost dnsmasq[327248]: warning: no upstream servers 
configured Nov 26 05:06:20 localhost dnsmasq-dhcp[327248]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:06:20 localhost dnsmasq[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/addn_hosts - 0 addresses Nov 26 05:06:20 localhost dnsmasq-dhcp[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/host Nov 26 05:06:20 localhost dnsmasq-dhcp[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/opts Nov 26 05:06:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:20.361 262471 INFO neutron.agent.dhcp.agent [None req-38e2fcad-1f9f-49fc-9c6b-86dc57dd12fe - - - - - -] DHCP configuration for ports {'3b1f0745-6090-42aa-96fc-dd20f0dc28b9'} is completed#033[00m Nov 26 05:06:20 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:20.632 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:06:20Z, description=, device_id=e209acd7-752a-4151-8c90-dfc76485e6e8, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8a05f9a9-0226-4e1c-aa02-b83a895cefab, ip_allocation=immediate, mac_address=fa:16:3e:c7:7e:fd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:06:16Z, description=, dns_domain=, id=4890ed8f-d73a-4106-bf53-9d2cfe52e760, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-2017321751-network, port_security_enabled=True, project_id=b23390764b654d8db9370e455401b357, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=666, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3165, status=ACTIVE, 
subnets=['837b2dd6-e92b-450e-9c56-5e53e4e4e2f5'], tags=[], tenant_id=b23390764b654d8db9370e455401b357, updated_at=2025-11-26T10:06:17Z, vlan_transparent=None, network_id=4890ed8f-d73a-4106-bf53-9d2cfe52e760, port_security_enabled=False, project_id=b23390764b654d8db9370e455401b357, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3184, status=DOWN, tags=[], tenant_id=b23390764b654d8db9370e455401b357, updated_at=2025-11-26T10:06:20Z on network 4890ed8f-d73a-4106-bf53-9d2cfe52e760#033[00m Nov 26 05:06:20 localhost dnsmasq[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/addn_hosts - 1 addresses Nov 26 05:06:20 localhost dnsmasq-dhcp[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/host Nov 26 05:06:20 localhost dnsmasq-dhcp[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/opts Nov 26 05:06:20 localhost podman[327267]: 2025-11-26 10:06:20.866808389 +0000 UTC m=+0.061656152 container kill 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:06:20 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:20.873 2 INFO neutron.agent.securitygroups_rpc [None req-d601b765-b2d6-403a-b26d-e187842e5021 e3850b3461aa415d9384b57d59527d33 b95ca0c66f9343078bd952fd58e11f91 - - default default] Security group rule updated ['914251a0-eb3f-47b6-859a-28460bc6e230']#033[00m Nov 26 05:06:20 localhost systemd[1]: Started /usr/bin/podman healthcheck 
run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:06:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:06:21 localhost podman[327285]: 2025-11-26 10:06:21.082018097 +0000 UTC m=+0.087640567 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:06:21 localhost podman[327285]: 2025-11-26 10:06:21.092317046 +0000 UTC m=+0.097939546 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Nov 26 05:06:21 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:06:21 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:21.113 262471 INFO neutron.agent.dhcp.agent [None req-75345588-de60-46bc-a799-897b9807ef2f - - - - - -] DHCP configuration for ports {'8a05f9a9-0226-4e1c-aa02-b83a895cefab'} is completed#033[00m Nov 26 05:06:21 localhost podman[327286]: 2025-11-26 10:06:21.194191133 +0000 UTC m=+0.195951472 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 05:06:21 localhost podman[327286]: 2025-11-26 10:06:21.23444519 +0000 UTC m=+0.236205529 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118) Nov 26 05:06:21 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:06:22 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:22.168 2 INFO neutron.agent.securitygroups_rpc [None req-9b1ceb77-fd52-43f4-a3d9-ef9efcd46ca7 8790f5109ea84c179bc8b14923ff8237 48f18ef67d7544d6977d40f7cdc5b4ff - - default default] Security group rule updated ['2942cd33-cfc9-40b7-ad09-309a9a1e4d80']#033[00m Nov 26 05:06:22 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:22.242 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:06:20Z, description=, device_id=e209acd7-752a-4151-8c90-dfc76485e6e8, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8a05f9a9-0226-4e1c-aa02-b83a895cefab, ip_allocation=immediate, mac_address=fa:16:3e:c7:7e:fd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:06:16Z, description=, dns_domain=, id=4890ed8f-d73a-4106-bf53-9d2cfe52e760, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-2017321751-network, port_security_enabled=True, project_id=b23390764b654d8db9370e455401b357, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=666, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3165, status=ACTIVE, subnets=['837b2dd6-e92b-450e-9c56-5e53e4e4e2f5'], tags=[], tenant_id=b23390764b654d8db9370e455401b357, 
updated_at=2025-11-26T10:06:17Z, vlan_transparent=None, network_id=4890ed8f-d73a-4106-bf53-9d2cfe52e760, port_security_enabled=False, project_id=b23390764b654d8db9370e455401b357, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3184, status=DOWN, tags=[], tenant_id=b23390764b654d8db9370e455401b357, updated_at=2025-11-26T10:06:20Z on network 4890ed8f-d73a-4106-bf53-9d2cfe52e760#033[00m Nov 26 05:06:22 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:22.327 2 INFO neutron.agent.securitygroups_rpc [None req-50e097dc-6dd2-488a-a96c-3df745ca9e25 8790f5109ea84c179bc8b14923ff8237 48f18ef67d7544d6977d40f7cdc5b4ff - - default default] Security group rule updated ['2942cd33-cfc9-40b7-ad09-309a9a1e4d80']#033[00m Nov 26 05:06:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:06:22 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/314302387' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:06:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:06:22 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/314302387' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:06:22 localhost dnsmasq[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/addn_hosts - 1 addresses Nov 26 05:06:22 localhost dnsmasq-dhcp[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/host Nov 26 05:06:22 localhost dnsmasq-dhcp[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/opts Nov 26 05:06:22 localhost podman[327339]: 2025-11-26 10:06:22.472965594 +0000 UTC m=+0.066049168 container kill 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:06:22 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:22 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/439bd99d-89c0-413c-a64e-e1fa83632ced/2b9732c2-1051-4817-bc7c-9132e10350e7", "osd", "allow rw pool=manila_data namespace=fsvolumens_439bd99d-89c0-413c-a64e-e1fa83632ced", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:22 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": 
"client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/439bd99d-89c0-413c-a64e-e1fa83632ced/2b9732c2-1051-4817-bc7c-9132e10350e7", "osd", "allow rw pool=manila_data namespace=fsvolumens_439bd99d-89c0-413c-a64e-e1fa83632ced", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:22 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/439bd99d-89c0-413c-a64e-e1fa83632ced/2b9732c2-1051-4817-bc7c-9132e10350e7", "osd", "allow rw pool=manila_data namespace=fsvolumens_439bd99d-89c0-413c-a64e-e1fa83632ced", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:06:22 localhost sshd[327360]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:06:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e208 e208: 6 total, 6 up, 6 in Nov 26 05:06:22 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:22.770 262471 INFO neutron.agent.dhcp.agent [None req-b630fad3-5be7-4430-bacc-7838d8cba730 - - - - - -] DHCP configuration for ports {'8a05f9a9-0226-4e1c-aa02-b83a895cefab'} is completed#033[00m Nov 26 05:06:22 localhost nova_compute[281415]: 2025-11-26 10:06:22.961 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e209 e209: 6 total, 6 up, 6 in Nov 26 05:06:24 localhost nova_compute[281415]: 2025-11-26 10:06:24.192 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 
handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:26 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:26 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:26 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:26 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:26 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"}]': finished Nov 26 05:06:27 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:27.013 2 INFO neutron.agent.securitygroups_rpc [None req-0da4d25b-d876-4334-9d78-523b3df2afc7 25bcc914f94e42b18ff5ffb3afe523c4 9742628fcf554550b7021556f4114164 - - default default] Security group member updated ['6315612a-0ef4-497e-b6b8-4837ad1394c8']#033[00m Nov 26 05:06:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:27 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:27 localhost podman[240049]: time="2025-11-26T10:06:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:06:27 localhost podman[240049]: @ - - [26/Nov/2025:10:06:27 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 26 05:06:27 localhost podman[240049]: @ - - [26/Nov/2025:10:06:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19736 "" "Go-http-client/1.1" Nov 26 05:06:27 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:27.658 262471 INFO neutron.agent.linux.ip_lib [None req-d7738397-fbe4-4a52-b37d-4ca0077e198e - - - - - -] Device tap3f21ff0b-5d cannot be used as it has no MAC address#033[00m Nov 26 05:06:27 localhost nova_compute[281415]: 2025-11-26 10:06:27.692 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:27 localhost kernel: device tap3f21ff0b-5d entered promiscuous mode Nov 26 05:06:27 localhost NetworkManager[5970]: [1764151587.6994] manager: (tap3f21ff0b-5d): new Generic device (/org/freedesktop/NetworkManager/Devices/77) Nov 26 05:06:27 localhost ovn_controller[153664]: 2025-11-26T10:06:27Z|00492|binding|INFO|Claiming lport 3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7 for this chassis. Nov 26 05:06:27 localhost ovn_controller[153664]: 2025-11-26T10:06:27Z|00493|binding|INFO|3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7: Claiming unknown Nov 26 05:06:27 localhost nova_compute[281415]: 2025-11-26 10:06:27.700 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:27 localhost systemd-udevd[327372]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:06:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:27.729 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-1cebb749-a443-461a-9a9a-67b64ad2e20e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1cebb749-a443-461a-9a9a-67b64ad2e20e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f4d16a61e2bd42aeadcc181e68b41c65', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3829556a-d366-497b-be33-b029ca4b7855, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:06:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:27.731 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7 in datapath 1cebb749-a443-461a-9a9a-67b64ad2e20e bound to our chassis#033[00m Nov 26 05:06:27 localhost ovn_controller[153664]: 2025-11-26T10:06:27Z|00494|binding|INFO|Setting lport 3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7 ovn-installed in OVS Nov 26 05:06:27 localhost ovn_controller[153664]: 2025-11-26T10:06:27Z|00495|binding|INFO|Setting lport 3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7 up in Southbound 
Nov 26 05:06:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:27.734 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port f9f29ec8-f61b-407b-bee5-84a5d5c80a65 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:06:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:27.734 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1cebb749-a443-461a-9a9a-67b64ad2e20e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:06:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:27.735 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[51711fac-a7d5-43a8-a21b-f0e2d6d441b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:06:27 localhost nova_compute[281415]: 2025-11-26 10:06:27.735 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:27 localhost journal[229445]: ethtool ioctl error on tap3f21ff0b-5d: No such device Nov 26 05:06:27 localhost journal[229445]: ethtool ioctl error on tap3f21ff0b-5d: No such device Nov 26 05:06:27 localhost journal[229445]: ethtool ioctl error on tap3f21ff0b-5d: No such device Nov 26 05:06:27 localhost journal[229445]: ethtool ioctl error on tap3f21ff0b-5d: No such device Nov 26 05:06:27 localhost journal[229445]: ethtool ioctl error on tap3f21ff0b-5d: No such device Nov 26 05:06:27 localhost journal[229445]: ethtool ioctl error on tap3f21ff0b-5d: No such device Nov 26 05:06:27 localhost journal[229445]: ethtool ioctl error on tap3f21ff0b-5d: No such device Nov 26 05:06:27 localhost journal[229445]: ethtool ioctl error on tap3f21ff0b-5d: No such device Nov 26 05:06:27 localhost nova_compute[281415]: 2025-11-26 
10:06:27.788 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:27 localhost nova_compute[281415]: 2025-11-26 10:06:27.838 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:27 localhost nova_compute[281415]: 2025-11-26 10:06:27.963 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:28 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:28.121 2 INFO neutron.agent.securitygroups_rpc [req-9f59edeb-582e-4a43-91a6-d6720948ef6d req-0d0aafff-fe85-4667-9186-5f8f7e626f4c 8790f5109ea84c179bc8b14923ff8237 48f18ef67d7544d6977d40f7cdc5b4ff - - default default] Security group member updated ['2942cd33-cfc9-40b7-ad09-309a9a1e4d80']#033[00m Nov 26 05:06:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e210 e210: 6 total, 6 up, 6 in Nov 26 05:06:28 localhost podman[327443]: Nov 26 05:06:28 localhost podman[327443]: 2025-11-26 10:06:28.956311382 +0000 UTC m=+0.097887374 container create 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:06:29 localhost systemd[1]: Started libpod-conmon-3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead.scope. 
Nov 26 05:06:29 localhost podman[327443]: 2025-11-26 10:06:28.917587542 +0000 UTC m=+0.059163554 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:06:29 localhost systemd[1]: Started libcrun container. Nov 26 05:06:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/673b824f3341d54794f2d02d596c44e4b965b1f01500601871c0088b8f34ee25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:06:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:29 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:29 localhost podman[327443]: 2025-11-26 10:06:29.047742085 +0000 UTC m=+0.189318097 container init 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:06:29 localhost podman[327443]: 2025-11-26 10:06:29.063202254 +0000 UTC m=+0.204778266 container start 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 05:06:29 localhost dnsmasq[327462]: started, version 2.85 cachesize 150 Nov 26 05:06:29 localhost dnsmasq[327462]: DNS service limited to local subnets Nov 26 05:06:29 localhost dnsmasq[327462]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:06:29 localhost dnsmasq[327462]: warning: no upstream servers configured Nov 26 05:06:29 localhost dnsmasq-dhcp[327462]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:06:29 localhost dnsmasq[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/addn_hosts - 0 addresses Nov 26 05:06:29 localhost dnsmasq-dhcp[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/host Nov 26 05:06:29 localhost dnsmasq-dhcp[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/opts Nov 26 05:06:29 localhost nova_compute[281415]: 2025-11-26 10:06:29.167 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:29 localhost nova_compute[281415]: 2025-11-26 10:06:29.197 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:29.278 262471 INFO neutron.agent.dhcp.agent [None req-01f7d036-0591-49d5-8fee-8a1b668d2ab3 - - - - - -] DHCP configuration for ports {'8bb559c1-d8ea-41ef-9552-184e634e8314'} is completed#033[00m Nov 26 05:06:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:29 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : 
from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:29 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:29.906 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:06:29Z, description=, device_id=32ca9841-038f-450d-b8d1-75a017848f34, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0add6e37-840c-4e6e-9999-43a1a6afd120, ip_allocation=immediate, mac_address=fa:16:3e:9c:8e:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:06:24Z, description=, dns_domain=, id=1cebb749-a443-461a-9a9a-67b64ad2e20e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-130403742-network, port_security_enabled=True, project_id=f4d16a61e2bd42aeadcc181e68b41c65, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34780, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3213, status=ACTIVE, subnets=['ba122c38-fda0-4ea8-8284-9aebdcfa507d'], tags=[], tenant_id=f4d16a61e2bd42aeadcc181e68b41c65, updated_at=2025-11-26T10:06:25Z, vlan_transparent=None, network_id=1cebb749-a443-461a-9a9a-67b64ad2e20e, port_security_enabled=False, project_id=f4d16a61e2bd42aeadcc181e68b41c65, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3229, status=DOWN, tags=[], 
tenant_id=f4d16a61e2bd42aeadcc181e68b41c65, updated_at=2025-11-26T10:06:29Z on network 1cebb749-a443-461a-9a9a-67b64ad2e20e#033[00m Nov 26 05:06:30 localhost dnsmasq[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/addn_hosts - 1 addresses Nov 26 05:06:30 localhost dnsmasq-dhcp[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/host Nov 26 05:06:30 localhost dnsmasq-dhcp[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/opts Nov 26 05:06:30 localhost systemd[1]: tmp-crun.ZYNScT.mount: Deactivated successfully. Nov 26 05:06:30 localhost podman[327480]: 2025-11-26 10:06:30.192105732 +0000 UTC m=+0.081417434 container kill 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:06:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:30.452 262471 INFO neutron.agent.dhcp.agent [None req-e6a48956-815a-4751-9ae1-db4db95b01fb - - - - - -] DHCP configuration for ports {'0add6e37-840c-4e6e-9999-43a1a6afd120'} is completed#033[00m Nov 26 05:06:30 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:30.727 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:06:29Z, description=, device_id=32ca9841-038f-450d-b8d1-75a017848f34, device_owner=network:router_interface, dns_assignment=[], dns_domain=, 
dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0add6e37-840c-4e6e-9999-43a1a6afd120, ip_allocation=immediate, mac_address=fa:16:3e:9c:8e:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:06:24Z, description=, dns_domain=, id=1cebb749-a443-461a-9a9a-67b64ad2e20e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-130403742-network, port_security_enabled=True, project_id=f4d16a61e2bd42aeadcc181e68b41c65, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34780, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3213, status=ACTIVE, subnets=['ba122c38-fda0-4ea8-8284-9aebdcfa507d'], tags=[], tenant_id=f4d16a61e2bd42aeadcc181e68b41c65, updated_at=2025-11-26T10:06:25Z, vlan_transparent=None, network_id=1cebb749-a443-461a-9a9a-67b64ad2e20e, port_security_enabled=False, project_id=f4d16a61e2bd42aeadcc181e68b41c65, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3229, status=DOWN, tags=[], tenant_id=f4d16a61e2bd42aeadcc181e68b41c65, updated_at=2025-11-26T10:06:29Z on network 1cebb749-a443-461a-9a9a-67b64ad2e20e#033[00m Nov 26 05:06:30 localhost dnsmasq[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/addn_hosts - 1 addresses Nov 26 05:06:30 localhost dnsmasq-dhcp[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/host Nov 26 05:06:30 localhost dnsmasq-dhcp[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/opts Nov 26 05:06:30 localhost podman[327518]: 2025-11-26 10:06:30.980553551 +0000 UTC m=+0.071192427 container kill 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:06:31 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:31.344 262471 INFO neutron.agent.dhcp.agent [None req-34abb7e0-c816-40a0-aefa-e51d871e4bc7 - - - - - -] DHCP configuration for ports {'0add6e37-840c-4e6e-9999-43a1a6afd120'} is completed#033[00m Nov 26 05:06:32 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:32 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/e5bab9d8-b14d-41f8-b946-b057952c39f2/ac621c6f-df14-456d-8001-fa686b543f83", "osd", "allow rw pool=manila_data namespace=fsvolumens_e5bab9d8-b14d-41f8-b946-b057952c39f2", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:32 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/e5bab9d8-b14d-41f8-b946-b057952c39f2/ac621c6f-df14-456d-8001-fa686b543f83", "osd", "allow rw pool=manila_data namespace=fsvolumens_e5bab9d8-b14d-41f8-b946-b057952c39f2", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:32 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": 
["mds", "allow rw path=/volumes/_nogroup/e5bab9d8-b14d-41f8-b946-b057952c39f2/ac621c6f-df14-456d-8001-fa686b543f83", "osd", "allow rw pool=manila_data namespace=fsvolumens_e5bab9d8-b14d-41f8-b946-b057952c39f2", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:06:32 localhost nova_compute[281415]: 2025-11-26 10:06:32.965 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:33 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:33 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:33 localhost neutron_sriov_agent[255515]: 2025-11-26 10:06:33.108 2 INFO neutron.agent.securitygroups_rpc [None req-fb77e30a-2a6a-4686-83f2-5f2ce804ac68 25bcc914f94e42b18ff5ffb3afe523c4 9742628fcf554550b7021556f4114164 - - default default] Security group member updated ['6315612a-0ef4-497e-b6b8-4837ad1394c8']#033[00m Nov 26 05:06:34 localhost nova_compute[281415]: 2025-11-26 10:06:34.238 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:36 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:36 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' 
entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:36 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:36 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:36 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"}]': finished Nov 26 05:06:37 localhost nova_compute[281415]: 2025-11-26 10:06:37.967 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:38 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e211 e211: 6 total, 6 up, 6 in Nov 26 05:06:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:06:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:06:38 localhost podman[327539]: 2025-11-26 10:06:38.842093879 +0000 UTC m=+0.092299890 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:06:38 localhost podman[327539]: 2025-11-26 10:06:38.881451059 +0000 UTC m=+0.131657050 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:06:38 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:06:38 localhost podman[327540]: 2025-11-26 10:06:38.91763615 +0000 UTC m=+0.166426817 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 26 05:06:38 localhost podman[327540]: 2025-11-26 10:06:38.934377439 +0000 UTC m=+0.183168146 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.vendor=CentOS) Nov 26 05:06:38 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:06:39 localhost nova_compute[281415]: 2025-11-26 10:06:39.269 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:39 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:39 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:39 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:39 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"}]': finished Nov 26 
05:06:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:39 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e212 e212: 6 total, 6 up, 6 in Nov 26 05:06:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:41 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e213 e213: 6 total, 6 up, 6 in Nov 26 05:06:42 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e214 e214: 6 total, 6 up, 6 in Nov 26 05:06:42 localhost dnsmasq[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/addn_hosts - 0 addresses Nov 26 05:06:42 localhost dnsmasq-dhcp[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/host Nov 26 05:06:42 localhost dnsmasq-dhcp[327248]: read /var/lib/neutron/dhcp/4890ed8f-d73a-4106-bf53-9d2cfe52e760/opts Nov 26 05:06:42 localhost podman[327599]: 2025-11-26 10:06:42.576544337 +0000 UTC m=+0.064756968 container kill 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:06:42 localhost ovn_controller[153664]: 2025-11-26T10:06:42Z|00496|binding|INFO|Releasing lport 
0c42f84a-2482-4da6-b9da-00ce975f9ad2 from this chassis (sb_readonly=0) Nov 26 05:06:42 localhost kernel: device tap0c42f84a-24 left promiscuous mode Nov 26 05:06:42 localhost nova_compute[281415]: 2025-11-26 10:06:42.793 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:42 localhost ovn_controller[153664]: 2025-11-26T10:06:42Z|00497|binding|INFO|Setting lport 0c42f84a-2482-4da6-b9da-00ce975f9ad2 down in Southbound Nov 26 05:06:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:42.810 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-4890ed8f-d73a-4106-bf53-9d2cfe52e760', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4890ed8f-d73a-4106-bf53-9d2cfe52e760', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b23390764b654d8db9370e455401b357', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69014c16-e0ff-4371-83cb-2c858ed8bf9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0c42f84a-2482-4da6-b9da-00ce975f9ad2) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:06:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:42.811 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 0c42f84a-2482-4da6-b9da-00ce975f9ad2 in datapath 4890ed8f-d73a-4106-bf53-9d2cfe52e760 unbound from our chassis#033[00m Nov 26 05:06:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:42.816 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4890ed8f-d73a-4106-bf53-9d2cfe52e760, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:06:42 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:42.819 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9e991a-4600-4030-98a5-0f57c82cd807]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:06:42 localhost nova_compute[281415]: 2025-11-26 10:06:42.820 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:42 localhost nova_compute[281415]: 2025-11-26 10:06:42.969 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:06:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:06:43 localhost podman[327622]: 2025-11-26 10:06:43.821101978 +0000 UTC m=+0.076729009 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:06:43 localhost podman[327622]: 2025-11-26 10:06:43.904239364 +0000 UTC m=+0.159866445 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 05:06:43 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:06:43 localhost podman[327623]: 2025-11-26 10:06:43.923650265 +0000 UTC m=+0.176542011 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter) Nov 26 05:06:43 localhost podman[327623]: 2025-11-26 10:06:43.962077746 +0000 UTC m=+0.214969542 container exec_died 
a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=) Nov 26 05:06:43 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:06:44 localhost nova_compute[281415]: 2025-11-26 10:06:44.270 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e215 e215: 6 total, 6 up, 6 in Nov 26 05:06:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:44 localhost ovn_controller[153664]: 2025-11-26T10:06:44Z|00498|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:06:44 localhost nova_compute[281415]: 2025-11-26 10:06:44.915 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:45 localhost dnsmasq[327248]: exiting on receipt of SIGTERM Nov 26 05:06:45 localhost podman[327686]: 2025-11-26 10:06:45.485042132 +0000 UTC m=+0.065404147 container kill 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:06:45 localhost systemd[1]: libpod-21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8.scope: Deactivated successfully. Nov 26 05:06:45 localhost podman[327700]: 2025-11-26 10:06:45.579982924 +0000 UTC m=+0.073774917 container died 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:06:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8-userdata-shm.mount: Deactivated successfully. 
Nov 26 05:06:45 localhost podman[327700]: 2025-11-26 10:06:45.617752885 +0000 UTC m=+0.111544838 container cleanup 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:06:45 localhost systemd[1]: libpod-conmon-21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8.scope: Deactivated successfully. Nov 26 05:06:45 localhost podman[327701]: 2025-11-26 10:06:45.653181733 +0000 UTC m=+0.142679472 container remove 21dd3021ae68c66b600d8c319046f03f24da5bb4047e28df00c3e6b6044400b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4890ed8f-d73a-4106-bf53-9d2cfe52e760, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:06:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:45.679 262471 INFO neutron.agent.dhcp.agent [None req-b1f4601e-63e4-4d1f-83fa-8de657d51f18 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:06:45 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:45.681 262471 INFO neutron.agent.dhcp.agent [None req-b1f4601e-63e4-4d1f-83fa-8de657d51f18 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:06:45 localhost 
openstack_network_exporter[242153]: ERROR 10:06:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:06:45 localhost openstack_network_exporter[242153]: ERROR 10:06:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:06:45 localhost openstack_network_exporter[242153]: ERROR 10:06:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:06:45 localhost openstack_network_exporter[242153]: ERROR 10:06:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:06:45 localhost openstack_network_exporter[242153]: Nov 26 05:06:45 localhost openstack_network_exporter[242153]: ERROR 10:06:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:06:45 localhost openstack_network_exporter[242153]: Nov 26 05:06:46 localhost dnsmasq[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/addn_hosts - 0 addresses Nov 26 05:06:46 localhost dnsmasq-dhcp[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/host Nov 26 05:06:46 localhost podman[327744]: 2025-11-26 10:06:46.431830687 +0000 UTC m=+0.059590377 container kill 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:06:46 localhost dnsmasq-dhcp[327462]: read /var/lib/neutron/dhcp/1cebb749-a443-461a-9a9a-67b64ad2e20e/opts Nov 26 05:06:46 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:06:46 localhost systemd[1]: var-lib-containers-storage-overlay-8389bcfeccf59545a7a2d2123ae64fd08137805c40bcf28384e28a00cd22716e-merged.mount: Deactivated successfully. Nov 26 05:06:46 localhost systemd[1]: run-netns-qdhcp\x2d4890ed8f\x2dd73a\x2d4106\x2dbf53\x2d9d2cfe52e760.mount: Deactivated successfully. Nov 26 05:06:46 localhost podman[327760]: 2025-11-26 10:06:46.59328869 +0000 UTC m=+0.095930753 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:06:46 localhost 
podman[327760]: 2025-11-26 10:06:46.606754568 +0000 UTC m=+0.109396621 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:06:46 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:06:46 localhost ovn_controller[153664]: 2025-11-26T10:06:46Z|00499|binding|INFO|Releasing lport 3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7 from this chassis (sb_readonly=0) Nov 26 05:06:46 localhost ovn_controller[153664]: 2025-11-26T10:06:46Z|00500|binding|INFO|Setting lport 3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7 down in Southbound Nov 26 05:06:46 localhost nova_compute[281415]: 2025-11-26 10:06:46.626 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:46 localhost kernel: device tap3f21ff0b-5d left promiscuous mode Nov 26 05:06:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:46.647 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-1cebb749-a443-461a-9a9a-67b64ad2e20e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1cebb749-a443-461a-9a9a-67b64ad2e20e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f4d16a61e2bd42aeadcc181e68b41c65', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3829556a-d366-497b-be33-b029ca4b7855, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:06:46 localhost nova_compute[281415]: 2025-11-26 10:06:46.647 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:46.650 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 3f21ff0b-5dcc-4ed6-ae9a-3b99333ff9a7 in datapath 1cebb749-a443-461a-9a9a-67b64ad2e20e unbound from our chassis#033[00m Nov 26 05:06:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:46.652 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1cebb749-a443-461a-9a9a-67b64ad2e20e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:06:46 localhost ovn_metadata_agent[159481]: 2025-11-26 10:06:46.653 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[f7038798-4040-4f5a-bc02-b95d169e0ba4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:06:47 localhost ovn_controller[153664]: 2025-11-26T10:06:47Z|00501|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:06:47 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:06:47 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:06:47 localhost nova_compute[281415]: 2025-11-26 10:06:47.686 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:47 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon).osd e216 e216: 6 total, 6 up, 6 in Nov 26 05:06:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"}]': finished Nov 26 05:06:47 localhost nova_compute[281415]: 2025-11-26 10:06:47.971 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:48 localhost dnsmasq[327462]: exiting on receipt of SIGTERM Nov 26 05:06:48 localhost podman[327808]: 2025-11-26 10:06:48.168102403 +0000 UTC m=+0.062827897 container kill 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 26 05:06:48 localhost systemd[1]: libpod-3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead.scope: Deactivated successfully. 
Nov 26 05:06:48 localhost podman[327821]: 2025-11-26 10:06:48.245627415 +0000 UTC m=+0.059394701 container died 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:06:48 localhost podman[327821]: 2025-11-26 10:06:48.286625946 +0000 UTC m=+0.100393192 container cleanup 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:06:48 localhost systemd[1]: libpod-conmon-3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead.scope: Deactivated successfully. 
Nov 26 05:06:48 localhost podman[327823]: 2025-11-26 10:06:48.329883006 +0000 UTC m=+0.136006086 container remove 3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1cebb749-a443-461a-9a9a-67b64ad2e20e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:06:48 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:48.354 262471 INFO neutron.agent.dhcp.agent [None req-44252aec-181c-4dc4-a81d-f4d4e69ea8cd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:06:48 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:06:48.355 262471 INFO neutron.agent.dhcp.agent [None req-44252aec-181c-4dc4-a81d-f4d4e69ea8cd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:06:48 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e217 e217: 6 total, 6 up, 6 in Nov 26 05:06:49 localhost systemd[1]: var-lib-containers-storage-overlay-673b824f3341d54794f2d02d596c44e4b965b1f01500601871c0088b8f34ee25-merged.mount: Deactivated successfully. Nov 26 05:06:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d4eeb5ff2d9a7915f06d46b4c0e2c0bc4a1866a83c5fdb4cc122719cace2ead-userdata-shm.mount: Deactivated successfully. Nov 26 05:06:49 localhost systemd[1]: run-netns-qdhcp\x2d1cebb749\x2da443\x2d461a\x2d9a9a\x2d67b64ad2e20e.mount: Deactivated successfully. 
Nov 26 05:06:49 localhost nova_compute[281415]: 2025-11-26 10:06:49.275 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:49 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:49 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:49 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:49 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:06:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 
05:06:50 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e218 e218: 6 total, 6 up, 6 in Nov 26 05:06:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:06:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:06:51 localhost systemd[1]: tmp-crun.hpg8Kh.mount: Deactivated successfully. Nov 26 05:06:51 localhost podman[327852]: 2025-11-26 10:06:51.878914138 +0000 UTC m=+0.133258080 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0) Nov 26 05:06:51 localhost podman[327851]: 2025-11-26 10:06:51.844484131 +0000 UTC m=+0.103472747 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:06:51 localhost podman[327852]: 2025-11-26 10:06:51.920400903 +0000 UTC m=+0.174744845 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, 
container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd) Nov 26 05:06:51 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:06:51 localhost podman[327851]: 2025-11-26 10:06:51.977485972 +0000 UTC m=+0.236474638 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:06:51 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:06:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:06:52 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1583363213' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:06:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:06:52 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1583363213' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:06:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:06:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"}]': finished Nov 26 05:06:52 localhost nova_compute[281415]: 2025-11-26 10:06:52.974 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 26 05:06:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e219 e219: 6 total, 6 up, 6 in Nov 26 05:06:54 localhost nova_compute[281415]: 2025-11-26 10:06:54.312 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:06:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e220 e220: 6 total, 6 up, 6 in Nov 26 05:06:56 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:06:56 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:56 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:06:56 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:06:57 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e221 e221: 6 total, 6 up, 6 in Nov 26 05:06:57 localhost podman[240049]: time="2025-11-26T10:06:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:06:57 localhost podman[240049]: @ - - [26/Nov/2025:10:06:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 26 05:06:57 localhost podman[240049]: @ - - [26/Nov/2025:10:06:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19267 "" "Go-http-client/1.1" Nov 26 05:06:57 localhost nova_compute[281415]: 2025-11-26 10:06:57.976 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e222 e222: 6 total, 6 up, 6 in Nov 26 05:06:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:06:58 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3409976696' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:06:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:06:58 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3409976696' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:06:59 localhost nova_compute[281415]: 2025-11-26 10:06:59.345 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:06:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e223 e223: 6 total, 6 up, 6 in Nov 26 05:06:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:00 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:07:00 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:07:00 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:07:00 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"}]': finished Nov 26 05:07:00 localhost sshd[327890]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:07:01 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:07:01 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1964862986' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:07:01 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:07:01 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1964862986' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:07:02 localhost nova_compute[281415]: 2025-11-26 10:07:02.978 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:03 localhost dnsmasq[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/addn_hosts - 0 addresses Nov 26 05:07:03 localhost podman[327909]: 2025-11-26 10:07:03.138377788 +0000 UTC m=+0.073002243 container kill 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:07:03 localhost systemd[1]: tmp-crun.vXDguo.mount: Deactivated successfully. 
Nov 26 05:07:03 localhost dnsmasq-dhcp[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/host Nov 26 05:07:03 localhost dnsmasq-dhcp[327071]: read /var/lib/neutron/dhcp/c6c96c4d-a15e-43c2-92db-590d17ac6c96/opts Nov 26 05:07:03 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:07:03 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:03 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:03 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1408210637", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea5081ba-54f0-45b2-8168-76b7a178ad5c/bd11ef17-e721-4049-b42f-92ac03445e08", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea5081ba-54f0-45b2-8168-76b7a178ad5c", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:07:03 localhost nova_compute[281415]: 2025-11-26 10:07:03.352 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:03 localhost kernel: device tapb2e3b1e1-0a left promiscuous mode Nov 26 05:07:03 localhost ovn_controller[153664]: 2025-11-26T10:07:03Z|00502|binding|INFO|Releasing lport b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a from this chassis (sb_readonly=0) Nov 26 05:07:03 localhost ovn_controller[153664]: 2025-11-26T10:07:03Z|00503|binding|INFO|Setting lport b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a down in Southbound Nov 26 05:07:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:03.366 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-c6c96c4d-a15e-43c2-92db-590d17ac6c96', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6c96c4d-a15e-43c2-92db-590d17ac6c96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25af506c54f34bb4baedfbfac942ffa0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82bcb304-c12b-4e0e-8d67-c4a3965eb271, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:07:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 10:07:03.368 159486 INFO neutron.agent.ovn.metadata.agent [-] Port b2e3b1e1-0ac3-47a8-9e74-3e7dfe61aa8a in datapath c6c96c4d-a15e-43c2-92db-590d17ac6c96 unbound from our chassis#033[00m Nov 26 05:07:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:03.372 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c6c96c4d-a15e-43c2-92db-590d17ac6c96, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:07:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:03.374 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab2fbcd-9e73-4324-9d9a-dfd5dae61ebe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:07:03 localhost nova_compute[281415]: 2025-11-26 10:07:03.378 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e224 e224: 6 total, 6 up, 6 in Nov 26 05:07:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:03.675 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:07:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:03.676 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:07:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:03.677 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:07:04 localhost nova_compute[281415]: 2025-11-26 10:07:04.348 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:05 localhost nova_compute[281415]: 2025-11-26 10:07:05.404 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:05 localhost nova_compute[281415]: 2025-11-26 10:07:05.851 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:05 localhost nova_compute[281415]: 2025-11-26 10:07:05.871 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:07:05 localhost nova_compute[281415]: 2025-11-26 10:07:05.871 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:07:05 localhost 
nova_compute[281415]: 2025-11-26 10:07:05.872 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:07:05 localhost nova_compute[281415]: 2025-11-26 10:07:05.872 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:07:05 localhost nova_compute[281415]: 2025-11-26 10:07:05.872 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:07:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:07:06 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/611852591' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.325 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.399 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.400 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:07:06 localhost ovn_controller[153664]: 2025-11-26T10:07:06Z|00504|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.609 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.651 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.654 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11148MB free_disk=41.70030212402344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.655 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.656 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:07:06 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1408210637", "format": "json"} : dispatch Nov 26 05:07:06 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:07:06 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"} : dispatch Nov 26 05:07:06 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1408210637"}]': finished Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.750 
281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.751 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.751 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:07:06 localhost nova_compute[281415]: 2025-11-26 10:07:06.800 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:07:07 localhost podman[328055]: 2025-11-26 10:07:07.101037526 +0000 UTC m=+0.066422809 container kill 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 05:07:07 localhost dnsmasq[327071]: exiting on receipt of SIGTERM Nov 26 05:07:07 localhost systemd[1]: libpod-03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b.scope: Deactivated successfully. Nov 26 05:07:07 localhost podman[328069]: 2025-11-26 10:07:07.176647939 +0000 UTC m=+0.051746655 container died 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:07:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b-userdata-shm.mount: Deactivated successfully. Nov 26 05:07:07 localhost systemd[1]: var-lib-containers-storage-overlay-861db8e9e3e2368f86c3d0d56797fa1378a6c70c22343a6763cd2a59cbaaaddd-merged.mount: Deactivated successfully. Nov 26 05:07:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:07:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2947916514' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:07:07 localhost podman[328069]: 2025-11-26 10:07:07.23899742 +0000 UTC m=+0.114096096 container remove 03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6c96c4d-a15e-43c2-92db-590d17ac6c96, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 26 05:07:07 localhost nova_compute[281415]: 2025-11-26 10:07:07.246 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:07:07 localhost systemd[1]: libpod-conmon-03931ba36fd3aec4b0088fb8ea7eba9f0c58bae408072454b8bf478de320764b.scope: Deactivated successfully. 
Nov 26 05:07:07 localhost nova_compute[281415]: 2025-11-26 10:07:07.256 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:07:07 localhost nova_compute[281415]: 2025-11-26 10:07:07.274 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:07:07 localhost nova_compute[281415]: 2025-11-26 10:07:07.276 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:07:07 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:07.275 262471 INFO neutron.agent.dhcp.agent [None req-59c11484-3693-46e0-b48b-833d72aa9cb2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:07:07 localhost nova_compute[281415]: 2025-11-26 10:07:07.276 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.620s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:07:07 localhost systemd[1]: run-netns-qdhcp\x2dc6c96c4d\x2da15e\x2d43c2\x2d92db\x2d590d17ac6c96.mount: Deactivated successfully. Nov 26 05:07:07 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:07.378 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:07:07 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:07:07 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:07:07 localhost nova_compute[281415]: 2025-11-26 10:07:07.981 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:08 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:08 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:08 localhost nova_compute[281415]: 2025-11-26 10:07:08.273 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:08 localhost nova_compute[281415]: 2025-11-26 10:07:08.273 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:08 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e225 e225: 6 total, 6 up, 6 in Nov 
26 05:07:08 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:07:08 localhost nova_compute[281415]: 2025-11-26 10:07:08.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:08 localhost nova_compute[281415]: 2025-11-26 10:07:08.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:08 localhost nova_compute[281415]: 2025-11-26 10:07:08.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:08 localhost nova_compute[281415]: 2025-11-26 10:07:08.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:07:09 localhost nova_compute[281415]: 2025-11-26 10:07:09.334 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:09 localhost nova_compute[281415]: 2025-11-26 10:07:09.350 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. 
Nov 26 05:07:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:07:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:09 localhost podman[328112]: 2025-11-26 10:07:09.843156417 +0000 UTC m=+0.092374813 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:07:09 localhost podman[328112]: 2025-11-26 10:07:09.85842735 +0000 UTC m=+0.107645746 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 
'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:07:09 localhost podman[328113]: 2025-11-26 10:07:09.902729853 +0000 UTC m=+0.149153572 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true) Nov 26 05:07:09 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:07:09 localhost podman[328113]: 2025-11-26 10:07:09.940292657 +0000 UTC m=+0.186716386 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 26 05:07:09 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:07:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:10 localhost nova_compute[281415]: 2025-11-26 10:07:10.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:11 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:07:11 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:11 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:11 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:07:11 localhost nova_compute[281415]: 2025-11-26 10:07:11.905 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:12 localhost nova_compute[281415]: 2025-11-26 10:07:12.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:07:12 localhost nova_compute[281415]: 2025-11-26 10:07:12.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:07:12 localhost nova_compute[281415]: 2025-11-26 10:07:12.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:07:12 localhost nova_compute[281415]: 2025-11-26 10:07:12.953 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock 
"refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:07:12 localhost nova_compute[281415]: 2025-11-26 10:07:12.953 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:07:12 localhost nova_compute[281415]: 2025-11-26 10:07:12.954 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:07:12 localhost nova_compute[281415]: 2025-11-26 10:07:12.954 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:07:12 localhost nova_compute[281415]: 2025-11-26 10:07:12.986 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:13 localhost nova_compute[281415]: 2025-11-26 10:07:13.647 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:07:13 localhost nova_compute[281415]: 2025-11-26 10:07:13.666 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:07:13 localhost nova_compute[281415]: 2025-11-26 10:07:13.666 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:07:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:14 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:14.309 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:07:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:14.311 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:07:14 localhost nova_compute[281415]: 2025-11-26 10:07:14.349 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:14 localhost nova_compute[281415]: 2025-11-26 10:07:14.356 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:07:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:07:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:14 localhost podman[328156]: 2025-11-26 10:07:14.866402016 +0000 UTC m=+0.114665364 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6) Nov 26 05:07:14 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:14.900 262471 INFO neutron.agent.linux.ip_lib [None req-3860045f-849a-4401-9a8e-a61bb6a48bac - - - - - -] Device tap469088a1-ef cannot be used as it has no MAC address#033[00m Nov 26 05:07:14 localhost podman[328155]: 2025-11-26 10:07:14.904608189 +0000 UTC m=+0.153874439 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:07:14 localhost nova_compute[281415]: 2025-11-26 10:07:14.935 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:14 localhost kernel: device tap469088a1-ef entered promiscuous mode Nov 26 05:07:14 localhost ovn_controller[153664]: 2025-11-26T10:07:14Z|00505|binding|INFO|Claiming lport 469088a1-efdc-45cb-bec6-021e88f502a7 for this chassis. 
Nov 26 05:07:14 localhost NetworkManager[5970]: [1764151634.9503] manager: (tap469088a1-ef): new Generic device (/org/freedesktop/NetworkManager/Devices/78) Nov 26 05:07:14 localhost ovn_controller[153664]: 2025-11-26T10:07:14Z|00506|binding|INFO|469088a1-efdc-45cb-bec6-021e88f502a7: Claiming unknown Nov 26 05:07:14 localhost nova_compute[281415]: 2025-11-26 10:07:14.950 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:14 localhost podman[328156]: 2025-11-26 10:07:14.953391501 +0000 UTC m=+0.201654839 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 26 05:07:14 localhost systemd-udevd[328208]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:07:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:14.959 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5398cb98-6af3-480d-a740-fb6c9b807f6f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5398cb98-6af3-480d-a740-fb6c9b807f6f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd822319495c24889802d5a61a295ef62', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6a290dc-3890-4a34-b373-ac1ee4e3eee1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=469088a1-efdc-45cb-bec6-021e88f502a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:07:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:14.961 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 469088a1-efdc-45cb-bec6-021e88f502a7 in datapath 5398cb98-6af3-480d-a740-fb6c9b807f6f bound to our chassis#033[00m Nov 26 05:07:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:14.963 159486 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5398cb98-6af3-480d-a740-fb6c9b807f6f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 26 05:07:14 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:14.964 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc8af99-ec61-4d8d-a43d-e4efe83f7db0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:07:14 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:07:14 localhost journal[229445]: ethtool ioctl error on tap469088a1-ef: No such device Nov 26 05:07:14 localhost ovn_controller[153664]: 2025-11-26T10:07:14Z|00507|binding|INFO|Setting lport 469088a1-efdc-45cb-bec6-021e88f502a7 ovn-installed in OVS Nov 26 05:07:14 localhost ovn_controller[153664]: 2025-11-26T10:07:14Z|00508|binding|INFO|Setting lport 469088a1-efdc-45cb-bec6-021e88f502a7 up in Southbound Nov 26 05:07:14 localhost journal[229445]: ethtool ioctl error on tap469088a1-ef: No such device Nov 26 05:07:14 localhost nova_compute[281415]: 2025-11-26 10:07:14.996 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:15 localhost journal[229445]: ethtool ioctl error on tap469088a1-ef: No such device Nov 26 05:07:15 localhost journal[229445]: ethtool ioctl error on tap469088a1-ef: No such device Nov 26 05:07:15 localhost journal[229445]: ethtool ioctl error on tap469088a1-ef: No such device Nov 26 05:07:15 localhost journal[229445]: ethtool ioctl error on tap469088a1-ef: No such device Nov 26 05:07:15 localhost podman[328155]: 2025-11-26 10:07:15.020556612 +0000 UTC m=+0.269822832 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:07:15 localhost journal[229445]: ethtool ioctl error on tap469088a1-ef: No such device Nov 26 05:07:15 localhost journal[229445]: ethtool ioctl error on tap469088a1-ef: No such device Nov 26 05:07:15 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:07:15 localhost nova_compute[281415]: 2025-11-26 10:07:15.049 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:15 localhost nova_compute[281415]: 2025-11-26 10:07:15.088 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:15 localhost openstack_network_exporter[242153]: ERROR 10:07:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:07:15 localhost openstack_network_exporter[242153]: ERROR 10:07:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:07:15 localhost openstack_network_exporter[242153]: ERROR 10:07:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:07:15 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:07:15 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:07:15 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:07:15 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 26 05:07:15 localhost openstack_network_exporter[242153]: ERROR 10:07:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:07:15 localhost openstack_network_exporter[242153]: Nov 26 05:07:15 localhost openstack_network_exporter[242153]: ERROR 10:07:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): 
please specify an existing datapath Nov 26 05:07:15 localhost openstack_network_exporter[242153]: Nov 26 05:07:16 localhost podman[328279]: Nov 26 05:07:16 localhost podman[328279]: 2025-11-26 10:07:16.116384654 +0000 UTC m=+0.104508708 container create 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2) Nov 26 05:07:16 localhost podman[328279]: 2025-11-26 10:07:16.064781715 +0000 UTC m=+0.052905809 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:07:16 localhost systemd[1]: Started libpod-conmon-85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874.scope. Nov 26 05:07:16 localhost systemd[1]: Started libcrun container. 
Nov 26 05:07:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a26937d5bea0bf1cdf5d3b25b5256ffde6e729cc3a8e6edad349f91fc9e791f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:07:16 localhost podman[328279]: 2025-11-26 10:07:16.200870912 +0000 UTC m=+0.188994936 container init 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true) Nov 26 05:07:16 localhost podman[328279]: 2025-11-26 10:07:16.212603325 +0000 UTC m=+0.200727339 container start 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:07:16 localhost dnsmasq[328297]: started, version 2.85 cachesize 150 Nov 26 05:07:16 localhost dnsmasq[328297]: DNS service limited to local subnets Nov 26 05:07:16 localhost dnsmasq[328297]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:07:16 localhost dnsmasq[328297]: warning: no upstream servers 
configured Nov 26 05:07:16 localhost dnsmasq-dhcp[328297]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:07:16 localhost dnsmasq[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/addn_hosts - 0 addresses Nov 26 05:07:16 localhost dnsmasq-dhcp[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/host Nov 26 05:07:16 localhost dnsmasq-dhcp[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/opts Nov 26 05:07:16 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:16.313 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:07:16 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:16.422 262471 INFO neutron.agent.dhcp.agent [None req-23d02075-b4a3-4643-b564-992141d2184a - - - - - -] DHCP configuration for ports {'8add270c-30f7-4c1b-8b76-4069aa248ab6'} is completed#033[00m Nov 26 05:07:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 05:07:16 localhost podman[328298]: 2025-11-26 10:07:16.831091489 +0000 UTC m=+0.087436920 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:07:16 localhost podman[328298]: 2025-11-26 10:07:16.842023517 +0000 UTC m=+0.098369028 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:07:16 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e226 e226: 6 total, 6 up, 6 in Nov 26 05:07:16 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:07:17 localhost nova_compute[281415]: 2025-11-26 10:07:17.194 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:17 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:17.589 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:07:17Z, description=, device_id=fdaa0151-9575-4879-9920-eba93abda006, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=29ef0207-3c01-4d05-b999-f8e8568d593b, ip_allocation=immediate, mac_address=fa:16:3e:82:9e:a4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:07:13Z, description=, dns_domain=, id=5398cb98-6af3-480d-a740-fb6c9b807f6f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-328286573-network, port_security_enabled=True, project_id=d822319495c24889802d5a61a295ef62, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27451, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3354, status=ACTIVE, subnets=['4ba19de8-d14d-4129-8847-bcb9e6010bc1'], tags=[], tenant_id=d822319495c24889802d5a61a295ef62, updated_at=2025-11-26T10:07:14Z, vlan_transparent=None, network_id=5398cb98-6af3-480d-a740-fb6c9b807f6f, port_security_enabled=False, project_id=d822319495c24889802d5a61a295ef62, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3376, status=DOWN, tags=[], tenant_id=d822319495c24889802d5a61a295ef62, updated_at=2025-11-26T10:07:17Z on network 
5398cb98-6af3-480d-a740-fb6c9b807f6f#033[00m Nov 26 05:07:17 localhost dnsmasq[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/addn_hosts - 1 addresses Nov 26 05:07:17 localhost dnsmasq-dhcp[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/host Nov 26 05:07:17 localhost podman[328338]: 2025-11-26 10:07:17.835777728 +0000 UTC m=+0.065480971 container kill 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 26 05:07:17 localhost dnsmasq-dhcp[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/opts Nov 26 05:07:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e227 e227: 6 total, 6 up, 6 in Nov 26 05:07:17 localhost nova_compute[281415]: 2025-11-26 10:07:17.987 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:18.076 262471 INFO neutron.agent.dhcp.agent [None req-fcffaf85-3ab3-412f-8c5f-e686d3fac8b1 - - - - - -] DHCP configuration for ports {'29ef0207-3c01-4d05-b999-f8e8568d593b'} is completed#033[00m Nov 26 05:07:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:18.260 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:07:17Z, 
description=, device_id=fdaa0151-9575-4879-9920-eba93abda006, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=29ef0207-3c01-4d05-b999-f8e8568d593b, ip_allocation=immediate, mac_address=fa:16:3e:82:9e:a4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:07:13Z, description=, dns_domain=, id=5398cb98-6af3-480d-a740-fb6c9b807f6f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-328286573-network, port_security_enabled=True, project_id=d822319495c24889802d5a61a295ef62, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27451, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3354, status=ACTIVE, subnets=['4ba19de8-d14d-4129-8847-bcb9e6010bc1'], tags=[], tenant_id=d822319495c24889802d5a61a295ef62, updated_at=2025-11-26T10:07:14Z, vlan_transparent=None, network_id=5398cb98-6af3-480d-a740-fb6c9b807f6f, port_security_enabled=False, project_id=d822319495c24889802d5a61a295ef62, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3376, status=DOWN, tags=[], tenant_id=d822319495c24889802d5a61a295ef62, updated_at=2025-11-26T10:07:17Z on network 5398cb98-6af3-480d-a740-fb6c9b807f6f#033[00m Nov 26 05:07:18 localhost dnsmasq[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/addn_hosts - 1 addresses Nov 26 05:07:18 localhost dnsmasq-dhcp[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/host Nov 26 05:07:18 localhost dnsmasq-dhcp[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/opts Nov 26 05:07:18 localhost podman[328376]: 2025-11-26 10:07:18.510476292 +0000 UTC m=+0.094436307 container kill 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:07:18 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:18.726 262471 INFO neutron.agent.dhcp.agent [None req-5cc1bcfe-41db-4e06-8fff-364fe914edae - - - - - -] DHCP configuration for ports {'29ef0207-3c01-4d05-b999-f8e8568d593b'} is completed#033[00m Nov 26 05:07:18 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:07:18 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:18 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:18 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", 
"allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:07:19 localhost nova_compute[281415]: 2025-11-26 10:07:19.357 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:19 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3068679664' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e228 e228: 6 total, 6 up, 6 in Nov 26 05:07:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e229 e229: 6 total, 6 up, 6 in Nov 26 05:07:21 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:07:21 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:07:21 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:07:21 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 26 05:07:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:07:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:07:22 localhost podman[328398]: 2025-11-26 10:07:22.859281205 +0000 UTC m=+0.109555186 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 26 05:07:22 localhost podman[328398]: 2025-11-26 10:07:22.895400554 +0000 UTC m=+0.145674555 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true) Nov 26 05:07:22 localhost podman[328399]: 2025-11-26 10:07:22.905764564 +0000 UTC m=+0.151533786 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Nov 26 05:07:22 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:07:22 localhost podman[328399]: 2025-11-26 10:07:22.94241585 +0000 UTC m=+0.188185032 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:07:22 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:07:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e230 e230: 6 total, 6 up, 6 in Nov 26 05:07:22 localhost nova_compute[281415]: 2025-11-26 10:07:22.989 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0. Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.666962) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49 Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151643667039, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2047, "num_deletes": 266, "total_data_size": 3130029, "memory_usage": 3183880, "flush_reason": "Manual Compaction"} Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151643677811, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1771447, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28615, "largest_seqno": 30656, "table_properties": {"data_size": 1763371, "index_size": 4585, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 21864, "raw_average_key_size": 23, "raw_value_size": 1745612, "raw_average_value_size": 1839, "num_data_blocks": 197, 
"num_entries": 949, "num_filter_entries": 949, "num_deletions": 266, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151566, "oldest_key_time": 1764151566, "file_creation_time": 1764151643, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}} Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 10887 microseconds, and 5892 cpu microseconds. Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.677862) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1771447 bytes OK Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.677890) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.681228) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.681253) EVENT_LOG_v1 {"time_micros": 1764151643681247, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.681281) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 3119874, prev total WAL file size 3120198, number of live WAL files 2. Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.682312) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303038' seq:72057594037927935, type:22 .. 
'6D6772737461740034323630' seq:0, type:0; will stop at (end) Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1729KB)], [48(16MB)] Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151643682364, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 19360169, "oldest_snapshot_seqno": -1} Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 13208 keys, 17503540 bytes, temperature: kUnknown Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151643769828, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 17503540, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17427539, "index_size": 41916, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33029, "raw_key_size": 353225, "raw_average_key_size": 26, "raw_value_size": 17202255, "raw_average_value_size": 1302, "num_data_blocks": 1579, "num_entries": 13208, "num_filter_entries": 13208, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151643, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}} Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.770137) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 17503540 bytes Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.771504) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 221.2 rd, 200.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 16.8 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(20.8) write-amplify(9.9) OK, records in: 13728, records dropped: 520 output_compression: NoCompression Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.771522) EVENT_LOG_v1 {"time_micros": 1764151643771513, "job": 28, "event": "compaction_finished", "compaction_time_micros": 87534, "compaction_time_cpu_micros": 51075, "output_level": 6, "num_output_files": 1, "total_output_size": 17503540, "num_input_records": 13728, "num_output_records": 13208, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005536118/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151643771773, "job": 28, "event": "table_file_deletion", "file_number": 50} Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151643773293, "job": 28, "event": "table_file_deletion", "file_number": 48} Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.682211) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.773315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.773318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.773320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.773321) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:07:23 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:07:23.773323) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:07:24 localhost nova_compute[281415]: 2025-11-26 10:07:24.389 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:25 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1738081585' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:26 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e231 e231: 6 total, 6 up, 6 in Nov 26 05:07:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e232 e232: 6 total, 6 up, 6 in Nov 26 05:07:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:27 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:27 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:27 localhost podman[240049]: time="2025-11-26T10:07:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:07:27 localhost podman[240049]: @ - - [26/Nov/2025:10:07:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 26 05:07:27 localhost podman[240049]: @ - - [26/Nov/2025:10:07:27 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19270 "" "Go-http-client/1.1" Nov 26 05:07:27 localhost nova_compute[281415]: 2025-11-26 10:07:27.992 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:28 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:07:28 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:28 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:28 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:07:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e233 e233: 6 total, 6 up, 6 in Nov 26 05:07:29 localhost ovn_controller[153664]: 
2025-11-26T10:07:29Z|00509|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:07:29 localhost nova_compute[281415]: 2025-11-26 10:07:29.308 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:29 localhost nova_compute[281415]: 2025-11-26 10:07:29.393 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e234 e234: 6 total, 6 up, 6 in Nov 26 05:07:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:30 localhost dnsmasq[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/addn_hosts - 0 addresses Nov 26 05:07:30 localhost dnsmasq-dhcp[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/host Nov 26 05:07:30 localhost dnsmasq-dhcp[328297]: read /var/lib/neutron/dhcp/5398cb98-6af3-480d-a740-fb6c9b807f6f/opts Nov 26 05:07:30 localhost podman[328452]: 2025-11-26 10:07:30.985675781 +0000 UTC m=+0.066706688 container kill 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:07:31 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. 
Immutable memtables: 1. Nov 26 05:07:31 localhost ovn_controller[153664]: 2025-11-26T10:07:31Z|00510|binding|INFO|Releasing lport 469088a1-efdc-45cb-bec6-021e88f502a7 from this chassis (sb_readonly=0) Nov 26 05:07:31 localhost ovn_controller[153664]: 2025-11-26T10:07:31Z|00511|binding|INFO|Setting lport 469088a1-efdc-45cb-bec6-021e88f502a7 down in Southbound Nov 26 05:07:31 localhost kernel: device tap469088a1-ef left promiscuous mode Nov 26 05:07:31 localhost nova_compute[281415]: 2025-11-26 10:07:31.228 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:31.238 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-5398cb98-6af3-480d-a740-fb6c9b807f6f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5398cb98-6af3-480d-a740-fb6c9b807f6f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd822319495c24889802d5a61a295ef62', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6a290dc-3890-4a34-b373-ac1ee4e3eee1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=469088a1-efdc-45cb-bec6-021e88f502a7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:07:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:31.240 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 469088a1-efdc-45cb-bec6-021e88f502a7 in datapath 5398cb98-6af3-480d-a740-fb6c9b807f6f unbound from our chassis#033[00m Nov 26 05:07:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:31.243 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5398cb98-6af3-480d-a740-fb6c9b807f6f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:07:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:07:31.245 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f45cbf-ba20-49f0-8048-12bd03e4a98a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:07:31 localhost sshd[328475]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:07:31 localhost nova_compute[281415]: 2025-11-26 10:07:31.252 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:31 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:07:31 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:07:31 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:07:31 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": 
"auth rm", "entity": "client.alice_bob"}]': finished Nov 26 05:07:31 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e235 e235: 6 total, 6 up, 6 in Nov 26 05:07:32 localhost ovn_controller[153664]: 2025-11-26T10:07:32Z|00512|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:07:32 localhost nova_compute[281415]: 2025-11-26 10:07:32.345 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:32 localhost dnsmasq[328297]: exiting on receipt of SIGTERM Nov 26 05:07:32 localhost podman[328494]: 2025-11-26 10:07:32.773005189 +0000 UTC m=+0.061247019 container kill 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:07:32 localhost systemd[1]: libpod-85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874.scope: Deactivated successfully. Nov 26 05:07:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:07:32 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/272967428' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:07:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:07:32 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/272967428' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:07:32 localhost podman[328506]: 2025-11-26 10:07:32.831552883 +0000 UTC m=+0.047264555 container died 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:07:32 localhost podman[328506]: 2025-11-26 10:07:32.871332316 +0000 UTC m=+0.087044008 container cleanup 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:07:32 localhost systemd[1]: libpod-conmon-85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874.scope: Deactivated successfully. 
Nov 26 05:07:32 localhost podman[328508]: 2025-11-26 10:07:32.920076756 +0000 UTC m=+0.126315265 container remove 85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5398cb98-6af3-480d-a740-fb6c9b807f6f, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:07:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:32.947 262471 INFO neutron.agent.dhcp.agent [None req-821ca739-da6d-4b05-8638-2f0ae0a92936 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:07:32 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:07:32.948 262471 INFO neutron.agent.dhcp.agent [None req-821ca739-da6d-4b05-8638-2f0ae0a92936 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:07:32 localhost nova_compute[281415]: 2025-11-26 10:07:32.993 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:33 localhost systemd[1]: var-lib-containers-storage-overlay-4a26937d5bea0bf1cdf5d3b25b5256ffde6e729cc3a8e6edad349f91fc9e791f-merged.mount: Deactivated successfully. Nov 26 05:07:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85ab8078b55141599dadb4f8392f8c068e24907cf9331f086116b9ab55fc9874-userdata-shm.mount: Deactivated successfully. Nov 26 05:07:33 localhost systemd[1]: run-netns-qdhcp\x2d5398cb98\x2d6af3\x2d480d\x2da740\x2dfb6c9b807f6f.mount: Deactivated successfully. 
Nov 26 05:07:34 localhost nova_compute[281415]: 2025-11-26 10:07:34.396 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:34 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:07:34 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:34 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:34 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:07:35 localhost neutron_sriov_agent[255515]: 2025-11-26 10:07:35.902 2 INFO 
neutron.agent.securitygroups_rpc [req-65573a51-a565-4c3e-83ce-018c526f2fe1 req-c7589ddf-15dd-4d52-8419-c052d4e15747 8790f5109ea84c179bc8b14923ff8237 48f18ef67d7544d6977d40f7cdc5b4ff - - default default] Security group member updated ['2942cd33-cfc9-40b7-ad09-309a9a1e4d80']#033[00m Nov 26 05:07:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:36 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:37 localhost sshd[328535]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:07:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e236 e236: 6 total, 6 up, 6 in Nov 26 05:07:37 localhost nova_compute[281415]: 2025-11-26 10:07:37.997 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:38 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e237 e237: 6 total, 6 up, 6 in Nov 26 05:07:38 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:07:38 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:07:38 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:07:38 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 26 05:07:39 localhost nova_compute[281415]: 2025-11-26 10:07:39.431 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e238 e238: 6 total, 6 up, 6 in Nov 26 05:07:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e239 e239: 6 total, 6 up, 6 in Nov 26 05:07:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:07:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:07:40 localhost podman[328537]: 2025-11-26 10:07:40.84316381 +0000 UTC m=+0.095332725 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 26 05:07:40 localhost podman[328538]: 2025-11-26 10:07:40.908779894 +0000 UTC m=+0.153824848 container health_status 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:07:40 localhost podman[328537]: 2025-11-26 10:07:40.932595381 +0000 UTC m=+0.184764296 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:07:40 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 05:07:40 localhost podman[328538]: 2025-11-26 10:07:40.950883378 +0000 UTC m=+0.195928382 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:07:40 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 05:07:41 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:07:41 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:41 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:41 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:07:43 localhost nova_compute[281415]: 2025-11-26 10:07:42.998 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:43 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' 
entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e240 e240: 6 total, 6 up, 6 in Nov 26 05:07:43 localhost ovn_controller[153664]: 2025-11-26T10:07:43Z|00513|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:07:43 localhost nova_compute[281415]: 2025-11-26 10:07:43.991 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:44 localhost nova_compute[281415]: 2025-11-26 10:07:44.434 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:44 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:07:44 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:07:44 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:07:44 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 26 05:07:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:07:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:07:45 localhost openstack_network_exporter[242153]: ERROR 10:07:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:07:45 localhost openstack_network_exporter[242153]: ERROR 10:07:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:07:45 localhost openstack_network_exporter[242153]: ERROR 10:07:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:07:45 localhost openstack_network_exporter[242153]: ERROR 10:07:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:07:45 localhost openstack_network_exporter[242153]: Nov 26 05:07:45 localhost openstack_network_exporter[242153]: ERROR 10:07:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:07:45 localhost openstack_network_exporter[242153]: Nov 26 05:07:45 localhost podman[328581]: 2025-11-26 10:07:45.848962317 +0000 UTC m=+0.107178051 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller) Nov 26 05:07:45 localhost podman[328582]: 2025-11-26 10:07:45.917761049 +0000 UTC m=+0.170326788 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git) Nov 26 05:07:45 localhost podman[328582]: 2025-11-26 10:07:45.934918401 +0000 UTC m=+0.187484160 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350) Nov 26 05:07:45 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:07:45 localhost podman[328581]: 2025-11-26 10:07:45.992894407 +0000 UTC m=+0.251110161 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 26 05:07:46 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:07:46 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:07:46 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:07:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:07:47 localhost systemd[1]: tmp-crun.gFwc3M.mount: Deactivated successfully. Nov 26 05:07:47 localhost podman[328624]: 2025-11-26 10:07:47.84366357 +0000 UTC m=+0.103621791 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:07:47 localhost podman[328624]: 2025-11-26 10:07:47.877294642 +0000 UTC m=+0.137252823 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:07:47 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:07:48 localhost nova_compute[281415]: 2025-11-26 10:07:48.001 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:48 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e241 e241: 6 total, 6 up, 6 in Nov 26 05:07:48 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:07:48 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:48 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:07:48 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:07:49 localhost nova_compute[281415]: 2025-11-26 10:07:49.438 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e242 e242: 6 total, 6 up, 6 in Nov 26 05:07:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:07:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:07:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:07:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 26 05:07:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e243 e243: 6 total, 6 up, 6 in Nov 26 05:07:53 localhost nova_compute[281415]: 2025-11-26 10:07:53.006 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:54 localhost nova_compute[281415]: 2025-11-26 10:07:54.441 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:07:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 05:07:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:07:54 localhost podman[328647]: 2025-11-26 10:07:54.897816053 +0000 UTC m=+0.104049354 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 26 05:07:54 localhost podman[328647]: 2025-11-26 10:07:54.907319599 +0000 UTC m=+0.113552950 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:07:54 localhost systemd[1]: 
659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:07:54 localhost podman[328648]: 2025-11-26 10:07:54.959916748 +0000 UTC m=+0.159771451 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 05:07:54 localhost podman[328648]: 2025-11-26 
10:07:54.976572765 +0000 UTC m=+0.176427468 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:07:54 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:07:56 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e244 e244: 6 total, 6 up, 6 in Nov 26 05:07:57 localhost podman[240049]: time="2025-11-26T10:07:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:07:57 localhost podman[240049]: @ - - [26/Nov/2025:10:07:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:07:57 localhost podman[240049]: @ - - [26/Nov/2025:10:07:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18792 "" "Go-http-client/1.1" Nov 26 05:07:57 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e245 e245: 6 total, 6 up, 6 in Nov 26 05:07:58 localhost nova_compute[281415]: 2025-11-26 10:07:58.008 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:58 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e246 e246: 6 total, 6 up, 6 in Nov 26 05:07:59 localhost nova_compute[281415]: 2025-11-26 10:07:59.474 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:07:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:08:00 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3891171904' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:08:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:08:00 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3891171904' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:08:02 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:02 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:02 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:08:02 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:02 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:02 localhost ceph-mon[297296]: from='mgr.34351 ' 
entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:08:03 localhost nova_compute[281415]: 2025-11-26 10:08:03.013 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e247 e247: 6 total, 6 up, 6 in Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.587 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.594 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '264f54d7-36f6-47db-b853-ded97fc947a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.588007', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbcfad40-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': '52931f98fd49b49de864a483223913d05c213531a18b11fbb7e17fad93aeeb9a'}]}, 'timestamp': '2025-11-26 10:08:03.596209', '_unique_id': '436464ba37914e1fa207300b9656dc86'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.598 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.599 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 68 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f804e0d-0061-43d7-bc5c-6e41e9329c3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 68, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.599515', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbd0473c-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': 'a7ce1eb6f66614beacdf7b6514a7f2cf3a3b938d1a70cc9d70bf9f8cdc37702c'}]}, 'timestamp': '2025-11-26 10:08:03.600026', '_unique_id': '9f13f0f9228e412b8d9bd46b5123004a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) 
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.601 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.602 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.632 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.633 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b5429053-1415-4ce8-bc43-da4f026bf522', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.602229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbd54fb6-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': '5e6ddfc612842e9c47716fb6e4cada9cfa9edad7a5c75ac984ab262d3c976e47'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.602229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbd562ee-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': '1815b7a60dcf7a3039047cf87df6d7dc81d0be10c8a7d857010cc676df830cbb'}]}, 'timestamp': '2025-11-26 10:08:03.633433', '_unique_id': '1c65f1bed62d4b2c9ad9c9a6ce750730'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.634 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.635 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1958d48-8237-4df5-8a13-1f0ba0e59c94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.635673', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbd5cb62-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': '57be8981f4180d863f6c94906f137cdefc5c1a8e771f3e813378c46eccff0b8b'}]}, 'timestamp': '2025-11-26 10:08:03.636202', '_unique_id': '7dd3c1ce4eac40e7ba07f314bf4e098f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.637 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.638 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.638 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1974088-2de5-4e6f-be67-efe49443aec9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.638450', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbd63782-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': '337a3d69a403a32be82ea8c6e3c69a657f3af75da7747c72bd16e0b3bbee9c74'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.638450', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbd6492a-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': 'ca071144c57ea5a28b8e81ff0303018191e0439bec80d2971876a2c2582bceba'}]}, 'timestamp': '2025-11-26 10:08:03.639325', '_unique_id': '32a60fc95acc4142ade2193afdaa03f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.640 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.641 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '657450d6-4544-4366-99e4-a8bc9675ebd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.641521', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbd6afbe-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': 'f83b95de619f256f8540f1a0e40ab0ce55b6aed3580060c76ba7643c5964276e'}]}, 'timestamp': '2025-11-26 10:08:03.642021', '_unique_id': '6941e4026204475bbc9d4d988e0d126c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.642 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.644 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '965fce72-2c41-4d9f-801a-87d1a65e9a92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.644353', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbd71e86-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': '1910244b0402ce80843850cfab984088163636868deb703d6133da661936510b'}]}, 'timestamp': '2025-11-26 10:08:03.644814', '_unique_id': 'a0b231af472b4df0a7c4323960abb109'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.645 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.646 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.646 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.647 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1be0c250-13c8-4cf6-ba22-29e8d2c211ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.646925', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbd78402-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': '90490be83f92218fce618e5dd839df2de2f8377b4779f0a3da48c2b3c5d4f550'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.646925', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbd79424-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': '1e5a6bf834f9e61950ba6bb499b954572d560510d549dec84220e921f3821c52'}]}, 'timestamp': '2025-11-26 10:08:03.647795', '_unique_id': '2f07701a46e44b4a83fef76d817b153d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:08:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.648 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.649 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.650 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '293c9c01-20f9-43e5-b8ea-0964e9ea24aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.650254', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbd80544-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': '957b90a83470890c4c157fd1c25c4209bf93bb90591a0db45445693b26ab6c1d'}]}, 'timestamp': '2025-11-26 10:08:03.650720', '_unique_id': '6536e0f9480243869bba6d11f01b24f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.651 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.652 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.670 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 18520000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdbe2a7d-1b13-487e-b07c-5f22f3c43289', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18520000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:08:03.652920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'cbdb231e-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.912808254, 'message_signature': '4f2ff05c263b5d663fb7a887a422a67345961e666b01cfa22f74678eca28e038'}]}, 'timestamp': '2025-11-26 10:08:03.671173', '_unique_id': '5b156409787244d68c6cfab5d99b7f28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.672 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.673 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbbfca60-9ad1-4a5d-a32d-69ae08627fa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7557, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.673311', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbdb898a-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': 'faa9a1d817a4c329bd65391b3ca086654cfa30b857cd58da5df613b78af687e5'}]}, 'timestamp': '2025-11-26 10:08:03.673770', '_unique_id': '870022a61d914325afeb7439b044cd9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.674 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.676 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:08:03.676 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:08:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:08:03.677 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:08:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:08:03.678 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'baacf144-1264-4fc2-8211-60728434df77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.676057', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbdbf582-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': 'a3d081eac4b82a15878106e7dda98089e01a3c75a2bdc43d0d992d47d7f6f6da'}]}, 'timestamp': '2025-11-26 10:08:03.676534', '_unique_id': '83ab6acae7764b978bb54ce58b11923f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.677 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.678 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.689 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '687ae422-9edf-46b5-841b-3145278cd16f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.678848', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbde0962-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.921095831, 'message_signature': '80540608ebe5f17417832aa496935bd994bc2c69655f9a0edeaa1e12eb01a27d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.678848', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 
'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbde20c8-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.921095831, 'message_signature': 'd3317322bfc0a20ea328f92259bc1ab8b765bf8e17d395c95b1f874473adb9c9'}]}, 'timestamp': '2025-11-26 10:08:03.690730', '_unique_id': '1b3d1f5ca8234cea9f12278dc9804ea7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:08:03.691 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:08:03.691 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.691 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.692 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd9020b6f-ffc4-405b-9fe6-79c5e369b8d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:08:03.693119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'cbde8f54-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.912808254, 'message_signature': '4b3226b431ca17d5ee6ed166102a01c8dcbd8ccb077135d21c2fe55ba4e63d62'}]}, 'timestamp': '2025-11-26 10:08:03.693564', '_unique_id': 'e036bbedc8f8453bad01b323f80df683'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.694 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.695 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle 
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.696 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.696 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e248 e248: 6 total, 6 up, 6 in Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.696 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '169769e7-27a9-4023-b9b9-60e0f5d4b5f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.696254', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbdf09e8-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': 'f5589c084f7a0754d003f0d190748627076f6d31eb57fbf8036e3fce71ed2ee1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.696254', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbdf20e0-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': '9c49929cd540f2bfcd1b21b788258205762315f4570a3b04732caea9896857bf'}]}, 'timestamp': '2025-11-26 10:08:03.697282', '_unique_id': 'd871dedd996a43d5ac86aef16521bc86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.699 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.699 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.699 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.700 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf598853-87dc-4448-b087-583fa97e7db7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.699863', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbdf99bc-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.921095831, 'message_signature': '8dbe57682224fe4ce1d811e52a42cc108a4a81dac0a72888e1809ec78969dc73'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 
'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.699863', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbdfaa06-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.921095831, 'message_signature': 'e576fe6b268d5d096f6580b6115541d86b76092075ee990278f520f05e3173a8'}]}, 'timestamp': '2025-11-26 10:08:03.700786', '_unique_id': '08f642f3d2b94b13bd605ae0d453e273'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.701 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.703 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.703 12 DEBUG 
ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '191163d2-3b58-4456-ad12-8b7b884b75e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.703642', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbe02a8a-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': 
'388f234d5ff63a57116e7678ca2932730ce2e93c4ad91d333cb8db91758564a4'}]}, 'timestamp': '2025-11-26 10:08:03.704167', '_unique_id': '53a50d7c2cfd4cb890592c2dac582763'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.706 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.707 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.707 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8c1a1a12-bfa8-4667-a924-f091469d79af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.707014', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbe0b004-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': 'cc687c0cbe849505b80265f2e144b8a627cdbcb6fa49cf999634d5fd4cd55ae3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.707014', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbe0c7e2-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': '90df300df8f4c90a39042830a3ae12df28543b48ab79eec174de9bc71e60730a'}]}, 'timestamp': '2025-11-26 10:08:03.708364', '_unique_id': '8f62c0067fff4a7db38c61ccd34b745e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.709 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.711 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.711 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa59dad0-a594-4b37-ab4b-87635efaa473', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.711269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbe155f4-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': '8f9841e9968bec1c6a932f4dc01e394b2bac96d5a2e33e2609edab28b3491297'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.711269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbe16fb2-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.844492582, 'message_signature': 'd0fbca840d9e060179948326d9e3283bff11083bcb4f8b3351d592111e3dfddd'}]}, 'timestamp': '2025-11-26 10:08:03.712413', '_unique_id': 'b3de625bc8c5423ebe2c71ed20f5e15b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.713 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.714 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.714 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46a68681-3871-4b83-b375-1d4b99da7de6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:08:03.714613', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': 'cbe1df1a-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.830317701, 'message_signature': '467320858e9ef4f10bb206ee5063cdf40246b980cffd4a60fbdbbfabce3b4dca'}]}, 'timestamp': '2025-11-26 10:08:03.715409', '_unique_id': '0221e0632344449caf50f20957492c92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.716 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.717 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.717 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.718 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '319d42d0-1b15-4fd1-9ef1-dcaf3dae33f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:08:03.717532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cbe248b0-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.921095831, 'message_signature': '45e49fa84a99f1534a1e50c1d38c8d9bca3e937fd211f6e640e452296deeb251'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:08:03.717532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cbe25a80-caaf-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12158.921095831, 'message_signature': '913bde125961ad5d8815f344038306600e69a090fe2f68719dc8885933b59760'}]}, 'timestamp': '2025-11-26 10:08:03.718412', '_unique_id': 'ba01877dfa9b4e8db795c588a42f4999'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:08:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:08:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:08:03.719 12 ERROR oslo_messaging.notify.messaging Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0. Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:03.973078) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52 Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151683973189, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 1195, "num_deletes": 262, "total_data_size": 1373498, "memory_usage": 1395488, "flush_reason": "Manual Compaction"} Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151683981778, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 896598, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30661, "largest_seqno": 31851, "table_properties": {"data_size": 891326, "index_size": 2615, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13644, "raw_average_key_size": 21, "raw_value_size": 880118, "raw_average_value_size": 1414, "num_data_blocks": 112, "num_entries": 622, "num_filter_entries": 622, "num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", 
"column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151643, "oldest_key_time": 1764151643, "file_creation_time": 1764151683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}} Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 8766 microseconds, and 4334 cpu microseconds. Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:03.981859) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 896598 bytes OK Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:03.981908) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:03.984000) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:03.984023) EVENT_LOG_v1 {"time_micros": 1764151683984016, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:03.984051) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 1367271, prev total WAL file size 1367271, number of live WAL files 2. Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:03.985128) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. 
'7061786F73003132353531' seq:0, type:0; will stop at (end) Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(875KB)], [51(16MB)] Nov 26 05:08:03 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151683985195, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 18400138, "oldest_snapshot_seqno": -1} Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 13292 keys, 16757757 bytes, temperature: kUnknown Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151684105264, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 16757757, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16681678, "index_size": 41738, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33285, "raw_key_size": 356068, "raw_average_key_size": 26, "raw_value_size": 16455347, "raw_average_value_size": 1237, "num_data_blocks": 1564, "num_entries": 13292, "num_filter_entries": 13292, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}} Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:04.105592) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 16757757 bytes Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:04.107664) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.1 rd, 139.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 16.7 +0.0 blob) out(16.0 +0.0 blob), read-write-amplify(39.2) write-amplify(18.7) OK, records in: 13830, records dropped: 538 output_compression: NoCompression Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:04.107695) EVENT_LOG_v1 {"time_micros": 1764151684107681, "job": 30, "event": "compaction_finished", "compaction_time_micros": 120184, "compaction_time_cpu_micros": 64314, "output_level": 6, "num_output_files": 1, "total_output_size": 16757757, "num_input_records": 13830, "num_output_records": 13292, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005536118/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151684107998, "job": 30, "event": "table_file_deletion", "file_number": 53} Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151684110880, "job": 30, "event": "table_file_deletion", "file_number": 51} Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:03.984920) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:04.110947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:04.110953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:04.110955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:04.110957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:08:04 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:08:04.110959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:08:04 localhost nova_compute[281415]: 2025-11-26 10:08:04.512 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:05 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:08:05 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:08:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:08:05 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 26 05:08:05 localhost nova_compute[281415]: 2025-11-26 10:08:05.662 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:05 localhost nova_compute[281415]: 2025-11-26 10:08:05.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:05 localhost nova_compute[281415]: 2025-11-26 10:08:05.892 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:08:05 localhost 
nova_compute[281415]: 2025-11-26 10:08:05.892 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:08:05 localhost nova_compute[281415]: 2025-11-26 10:08:05.892 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:08:05 localhost nova_compute[281415]: 2025-11-26 10:08:05.893 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:08:05 localhost nova_compute[281415]: 2025-11-26 10:08:05.894 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:08:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:06 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:06 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:08:06 localhost 
ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/361017881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.385 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.450 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.453 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.676 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.678 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11152MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.678 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.679 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.775 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.776 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.776 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:08:06 localhost nova_compute[281415]: 2025-11-26 10:08:06.824 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:08:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:08:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/4193157758' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:08:07 localhost nova_compute[281415]: 2025-11-26 10:08:07.310 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:08:07 localhost nova_compute[281415]: 2025-11-26 10:08:07.318 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:08:07 localhost nova_compute[281415]: 2025-11-26 10:08:07.378 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:08:07 localhost nova_compute[281415]: 2025-11-26 10:08:07.380 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:08:07 localhost nova_compute[281415]: 2025-11-26 10:08:07.381 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:08:08 localhost nova_compute[281415]: 2025-11-26 10:08:08.058 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:08 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:08:08 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:08 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:08 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': 
finished Nov 26 05:08:09 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:08:09 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:08:09 localhost nova_compute[281415]: 2025-11-26 10:08:09.546 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:10 localhost nova_compute[281415]: 2025-11-26 10:08:10.382 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:10 localhost nova_compute[281415]: 2025-11-26 10:08:10.407 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:10 localhost nova_compute[281415]: 2025-11-26 10:08:10.408 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:10 localhost nova_compute[281415]: 2025-11-26 10:08:10.408 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 
26 05:08:10 localhost nova_compute[281415]: 2025-11-26 10:08:10.408 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:10 localhost nova_compute[281415]: 2025-11-26 10:08:10.409 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:08:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:08:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2177411584' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:08:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:08:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2177411584' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:08:10 localhost nova_compute[281415]: 2025-11-26 10:08:10.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:11 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:08:11 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:08:11 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:08:11 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 26 05:08:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:08:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:08:11 localhost nova_compute[281415]: 2025-11-26 10:08:11.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:11 localhost podman[328814]: 2025-11-26 10:08:11.856136882 +0000 UTC m=+0.103859087 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:08:11 localhost podman[328814]: 2025-11-26 10:08:11.89210777 +0000 UTC m=+0.139829985 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm) Nov 26 05:08:11 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:08:11 localhost podman[328813]: 2025-11-26 10:08:11.916511867 +0000 UTC m=+0.167130142 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:08:11 localhost podman[328813]: 2025-11-26 10:08:11.931469842 +0000 UTC m=+0.182088137 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': 
{'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:08:11 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:08:12 localhost nova_compute[281415]: 2025-11-26 10:08:12.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:08:12 localhost nova_compute[281415]: 2025-11-26 10:08:12.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:08:12 localhost nova_compute[281415]: 2025-11-26 10:08:12.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:08:12 localhost nova_compute[281415]: 2025-11-26 10:08:12.991 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:08:12 localhost nova_compute[281415]: 2025-11-26 10:08:12.992 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:08:12 localhost nova_compute[281415]: 2025-11-26 10:08:12.992 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:08:12 localhost nova_compute[281415]: 2025-11-26 10:08:12.992 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:08:13 localhost nova_compute[281415]: 2025-11-26 10:08:13.095 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:13 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:13 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:13 localhost nova_compute[281415]: 2025-11-26 10:08:13.395 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:08:13 localhost nova_compute[281415]: 2025-11-26 10:08:13.412 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:08:13 localhost nova_compute[281415]: 2025-11-26 10:08:13.413 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:08:13 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:08:14 localhost nova_compute[281415]: 2025-11-26 10:08:14.572 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:14 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:08:14 
localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:08:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:14 localhost ovn_controller[153664]: 2025-11-26T10:08:14Z|00514|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory Nov 26 05:08:15 localhost sshd[328854]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:08:15 localhost openstack_network_exporter[242153]: ERROR 10:08:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:08:15 localhost openstack_network_exporter[242153]: ERROR 10:08:15 appctl.go:144: 
Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:08:15 localhost openstack_network_exporter[242153]: ERROR 10:08:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:08:15 localhost openstack_network_exporter[242153]: ERROR 10:08:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:08:15 localhost openstack_network_exporter[242153]: Nov 26 05:08:15 localhost openstack_network_exporter[242153]: ERROR 10:08:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:08:15 localhost openstack_network_exporter[242153]: Nov 26 05:08:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:08:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:08:16 localhost systemd[1]: tmp-crun.O7ptFM.mount: Deactivated successfully. 
Nov 26 05:08:16 localhost podman[328857]: 2025-11-26 10:08:16.843388444 +0000 UTC m=+0.101310078 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6) Nov 26 05:08:16 localhost systemd[1]: tmp-crun.0VhkRW.mount: Deactivated successfully. Nov 26 05:08:16 localhost podman[328856]: 2025-11-26 10:08:16.88320507 +0000 UTC m=+0.139649978 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 05:08:16 localhost podman[328857]: 2025-11-26 10:08:16.88414151 +0000 UTC m=+0.142063134 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, 
io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 26 05:08:16 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:08:16 localhost podman[328856]: 2025-11-26 10:08:16.97039612 +0000 UTC m=+0.226840998 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118) Nov 26 05:08:16 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:08:17 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:08:17 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:08:17 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:08:17 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 26 05:08:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e249 e249: 6 total, 6 up, 6 in Nov 26 05:08:18 localhost nova_compute[281415]: 2025-11-26 10:08:18.155 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 05:08:18 localhost podman[328901]: 2025-11-26 10:08:18.830856226 +0000 UTC m=+0.091362410 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:08:18 localhost podman[328901]: 2025-11-26 10:08:18.844276932 +0000 UTC m=+0.104783116 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:08:18 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:08:19 localhost nova_compute[281415]: 2025-11-26 10:08:19.618 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:21 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:08:21 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:21 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:21 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:08:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e250 e250: 6 
total, 6 up, 6 in Nov 26 05:08:23 localhost nova_compute[281415]: 2025-11-26 10:08:23.191 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e251 e251: 6 total, 6 up, 6 in Nov 26 05:08:24 localhost nova_compute[281415]: 2025-11-26 10:08:24.623 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:08:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 05:08:25 localhost podman[328925]: 2025-11-26 10:08:25.825022643 +0000 UTC m=+0.084360191 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Nov 26 05:08:25 localhost podman[328925]: 2025-11-26 10:08:25.833376043 +0000 UTC 
m=+0.092713641 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 05:08:25 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:08:25 localhost podman[328926]: 2025-11-26 10:08:25.887719812 +0000 UTC m=+0.142290322 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 05:08:25 localhost podman[328926]: 2025-11-26 10:08:25.903259304 +0000 UTC m=+0.157829814 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd) Nov 26 05:08:25 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:08:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:27 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:27 localhost podman[240049]: time="2025-11-26T10:08:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:08:27 localhost podman[240049]: @ - - [26/Nov/2025:10:08:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:08:27 localhost podman[240049]: @ - - [26/Nov/2025:10:08:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1" Nov 26 05:08:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:08:27.552 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:08:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:08:27.554 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:08:27 localhost nova_compute[281415]: 2025-11-26 10:08:27.601 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:28 localhost nova_compute[281415]: 2025-11-26 10:08:28.193 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:28 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:08:28 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:08:28 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:08:28 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 26 05:08:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e252 e252: 6 total, 6 up, 6 in Nov 26 05:08:29 localhost nova_compute[281415]: 2025-11-26 10:08:29.626 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e253 e253: 6 total, 6 up, 6 in Nov 26 05:08:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:30 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:31 localhost 
ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:08:31 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:31 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:31 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:08:32 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e254 e254: 6 total, 6 up, 6 in Nov 26 05:08:33 localhost nova_compute[281415]: 2025-11-26 10:08:33.214 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:34 localhost ovn_metadata_agent[159481]: 2025-11-26 10:08:34.556 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:08:34 localhost nova_compute[281415]: 2025-11-26 10:08:34.662 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:34 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:08:34 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:08:34 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:08:34 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 26 05:08:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:37 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:38 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:08:38 localhost ceph-mon[297296]: 
from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:38 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:38 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:08:38 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e255 e255: 6 total, 6 up, 6 in Nov 26 05:08:38 localhost nova_compute[281415]: 2025-11-26 10:08:38.218 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:38 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e256 e256: 6 total, 6 up, 6 in Nov 26 05:08:39 localhost nova_compute[281415]: 2025-11-26 10:08:39.700 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e256 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:41 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:08:41 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:08:41 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:08:41 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 26 05:08:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:08:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:08:42 localhost podman[328960]: 2025-11-26 10:08:42.833518122 +0000 UTC m=+0.092311429 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:08:42 localhost podman[328960]: 2025-11-26 10:08:42.844311737 +0000 UTC m=+0.103105074 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:08:42 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:08:42 localhost podman[328961]: 2025-11-26 10:08:42.9345428 +0000 UTC m=+0.188713714 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:08:42 localhost podman[328961]: 2025-11-26 10:08:42.950489716 +0000 UTC m=+0.204660660 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 05:08:42 localhost systemd[1]: 
f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:08:43 localhost nova_compute[281415]: 2025-11-26 10:08:43.249 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:43 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e257 e257: 6 total, 6 up, 6 in Nov 26 05:08:44 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:08:44 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:44 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:44 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", 
"caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:08:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:44 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:44 localhost nova_compute[281415]: 2025-11-26 10:08:44.726 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:45 localhost openstack_network_exporter[242153]: ERROR 10:08:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:08:45 localhost openstack_network_exporter[242153]: ERROR 10:08:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:08:45 localhost openstack_network_exporter[242153]: ERROR 10:08:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:08:45 localhost openstack_network_exporter[242153]: ERROR 10:08:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:08:45 localhost openstack_network_exporter[242153]: Nov 26 05:08:45 localhost openstack_network_exporter[242153]: ERROR 10:08:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:08:45 localhost openstack_network_exporter[242153]: Nov 26 05:08:47 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:08:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:08:47 localhost systemd[1]: tmp-crun.DYo4Gr.mount: Deactivated successfully. Nov 26 05:08:47 localhost podman[329000]: 2025-11-26 10:08:47.843996545 +0000 UTC m=+0.098584624 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 26 05:08:47 localhost podman[329001]: 2025-11-26 10:08:47.917377325 +0000 UTC m=+0.166651438 container health_status 
a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers) Nov 26 05:08:47 localhost podman[329000]: 2025-11-26 10:08:47.935360273 +0000 UTC m=+0.189948352 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 05:08:47 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:08:47 localhost podman[329001]: 2025-11-26 10:08:47.957477131 +0000 UTC m=+0.206751224 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64) Nov 26 05:08:47 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:08:48 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:48 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:48 localhost nova_compute[281415]: 2025-11-26 10:08:48.251 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:48 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:08:48 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:08:48 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:08:48 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 26 05:08:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:08:49 localhost nova_compute[281415]: 2025-11-26 10:08:49.729 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:49 localhost systemd[1]: tmp-crun.dx6ecH.mount: Deactivated successfully. 
Nov 26 05:08:49 localhost podman[329044]: 2025-11-26 10:08:49.825191653 +0000 UTC m=+0.084434865 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:08:49 localhost podman[329044]: 2025-11-26 10:08:49.830514888 +0000 UTC m=+0.089758090 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:08:49 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:08:50 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:08:50 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:50 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:08:50 localhost sshd[329068]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:08:52 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:08:52 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:08:53 localhost nova_compute[281415]: 2025-11-26 
10:08:53.297 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:54 localhost nova_compute[281415]: 2025-11-26 10:08:54.774 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:54 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:54 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:08:54 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:08:54 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:08:54 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 26 05:08:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:08:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:08:56 localhost systemd[1]: tmp-crun.s52OEs.mount: Deactivated successfully. 
Nov 26 05:08:56 localhost podman[329070]: 2025-11-26 10:08:56.840610142 +0000 UTC m=+0.099253855 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:08:56 localhost podman[329070]: 2025-11-26 10:08:56.850363135 +0000 UTC 
m=+0.109006828 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:08:56 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:08:56 localhost podman[329071]: 2025-11-26 10:08:56.928395559 +0000 UTC m=+0.183703507 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Nov 26 05:08:56 localhost podman[329071]: 2025-11-26 10:08:56.94225664 +0000 UTC m=+0.197564618 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 26 05:08:56 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:08:57 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:08:57 localhost podman[240049]: time="2025-11-26T10:08:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:08:57 localhost podman[240049]: @ - - [26/Nov/2025:10:08:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:08:57 localhost podman[240049]: @ - - [26/Nov/2025:10:08:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18793 "" "Go-http-client/1.1" Nov 26 05:08:58 localhost nova_compute[281415]: 2025-11-26 10:08:58.330 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:08:58 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:58 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:08:58 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth 
get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:08:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:08:59 localhost nova_compute[281415]: 2025-11-26 10:08:59.810 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:00 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:09:00 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:09:00 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:09:00 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 26 05:09:03 localhost nova_compute[281415]: 2025-11-26 10:09:03.361 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:09:03.677 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:09:03 localhost 
ovn_metadata_agent[159481]: 2025-11-26 10:09:03.678 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:09:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:09:03.678 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:09:04 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:09:04 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:09:04 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:04 localhost nova_compute[281415]: 2025-11-26 10:09:04.849 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:06 localhost nova_compute[281415]: 2025-11-26 10:09:06.407 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:06 localhost nova_compute[281415]: 2025-11-26 10:09:06.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:06 localhost nova_compute[281415]: 2025-11-26 10:09:06.875 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:09:06 localhost nova_compute[281415]: 2025-11-26 10:09:06.876 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:09:06 localhost nova_compute[281415]: 2025-11-26 10:09:06.876 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:09:06 localhost nova_compute[281415]: 2025-11-26 10:09:06.876 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:09:06 localhost nova_compute[281415]: 2025-11-26 10:09:06.877 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:09:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:09:07 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/4083294568' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.352 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.426 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.428 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.639 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.641 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11148MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.642 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.642 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:09:07 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:09:07 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:09:07 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:09:07 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 26 05:09:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e258 e258: 6 total, 6 up, 6 in Nov 26 05:09:07 localhost 
nova_compute[281415]: 2025-11-26 10:09:07.981 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.982 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:09:07 localhost nova_compute[281415]: 2025-11-26 10:09:07.982 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.141 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing inventories for resource provider 05276789-7461-410b-9529-16f5185a8bff _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.199 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating ProviderTree inventory for provider 05276789-7461-410b-9529-16f5185a8bff from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 
15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.200 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Updating inventory in ProviderTree for provider 05276789-7461-410b-9529-16f5185a8bff with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.242 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing aggregate associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.276 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Refreshing trait associations for resource provider 05276789-7461-410b-9529-16f5185a8bff, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_F16C,HW_CPU_X86_SSE,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ACCELERATORS,HW_CPU_X86_AESNI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.312 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.385 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:08 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:09:08 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4005986548' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.768 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.775 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.794 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.796 281419 DEBUG nova.compute.resource_tracker [None 
req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.797 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.798 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:08 localhost nova_compute[281415]: 2025-11-26 10:09:08.798 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 26 05:09:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:09 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:09:09 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:09:09 localhost nova_compute[281415]: 2025-11-26 10:09:09.878 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:10 localhost 
ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:09:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4199666660' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:09:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:09:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4199666660' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:09:10 localhost nova_compute[281415]: 2025-11-26 10:09:10.811 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:10 localhost nova_compute[281415]: 2025-11-26 10:09:10.812 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:10 localhost nova_compute[281415]: 2025-11-26 10:09:10.812 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:10 localhost nova_compute[281415]: 2025-11-26 10:09:10.812 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:10 localhost nova_compute[281415]: 2025-11-26 10:09:10.813 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:09:10 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:09:10 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:10 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:10 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:09:10 localhost nova_compute[281415]: 2025-11-26 10:09:10.847 281419 DEBUG 
oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:09:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:09:11 localhost nova_compute[281415]: 2025-11-26 10:09:11.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:11 localhost nova_compute[281415]: 2025-11-26 10:09:11.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:11 localhost nova_compute[281415]: 2025-11-26 10:09:11.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 26 05:09:11 localhost nova_compute[281415]: 2025-11-26 10:09:11.873 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 26 05:09:12 localhost nova_compute[281415]: 2025-11-26 10:09:12.872 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running 
periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:12 localhost nova_compute[281415]: 2025-11-26 10:09:12.872 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:09:12 localhost nova_compute[281415]: 2025-11-26 10:09:12.873 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:09:13 localhost nova_compute[281415]: 2025-11-26 10:09:13.020 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:09:13 localhost nova_compute[281415]: 2025-11-26 10:09:13.020 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:09:13 localhost nova_compute[281415]: 2025-11-26 10:09:13.021 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:09:13 localhost nova_compute[281415]: 2025-11-26 10:09:13.021 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af 
obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:09:13 localhost nova_compute[281415]: 2025-11-26 10:09:13.435 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:13 localhost nova_compute[281415]: 2025-11-26 10:09:13.602 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:09:13 localhost nova_compute[281415]: 2025-11-26 10:09:13.615 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" 
lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:09:13 localhost nova_compute[281415]: 2025-11-26 10:09:13.616 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:09:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:09:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:09:13 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e259 e259: 6 total, 6 up, 6 in Nov 26 05:09:13 localhost podman[329237]: 2025-11-26 10:09:13.850347989 +0000 UTC m=+0.092839595 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 05:09:13 localhost podman[329237]: 2025-11-26 10:09:13.869425302 +0000 UTC m=+0.111916928 
container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:09:13 localhost podman[329238]: 2025-11-26 10:09:13.917823315 +0000 UTC m=+0.160728045 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 
'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm) Nov 26 05:09:13 localhost podman[329238]: 2025-11-26 10:09:13.935431592 +0000 UTC m=+0.178336312 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 26 05:09:13 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:09:13 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:09:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:09:14 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:09:14 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:09:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:09:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 26 05:09:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:14 localhost nova_compute[281415]: 2025-11-26 10:09:14.916 
281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:15 localhost openstack_network_exporter[242153]: ERROR 10:09:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:09:15 localhost openstack_network_exporter[242153]: Nov 26 05:09:15 localhost openstack_network_exporter[242153]: ERROR 10:09:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:09:15 localhost openstack_network_exporter[242153]: ERROR 10:09:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:09:15 localhost openstack_network_exporter[242153]: ERROR 10:09:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:09:15 localhost openstack_network_exporter[242153]: ERROR 10:09:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:09:15 localhost openstack_network_exporter[242153]: Nov 26 05:09:17 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:09:17 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:17 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:17 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:09:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:09:17 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:09:18 localhost nova_compute[281415]: 2025-11-26 10:09:18.486 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:09:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:09:18 localhost podman[329279]: 2025-11-26 10:09:18.821479058 +0000 UTC m=+0.082090691 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 05:09:18 localhost nova_compute[281415]: 2025-11-26 10:09:18.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:09:18 localhost podman[329280]: 2025-11-26 10:09:18.883207375 +0000 UTC m=+0.139019279 container health_status 
a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 26 05:09:18 localhost podman[329279]: 2025-11-26 10:09:18.889516332 +0000 UTC m=+0.150127995 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:09:18 localhost podman[329280]: 2025-11-26 10:09:18.896995524 +0000 UTC m=+0.152807438 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc.) Nov 26 05:09:18 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:09:18 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:09:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:19 localhost nova_compute[281415]: 2025-11-26 10:09:19.964 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 05:09:20 localhost podman[329324]: 2025-11-26 10:09:20.825068901 +0000 UTC m=+0.086396154 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:09:20 localhost podman[329324]: 2025-11-26 10:09:20.83368284 +0000 UTC m=+0.095010093 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:09:20 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:09:21 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:09:21 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:09:22 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:09:22 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:09:22 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:09:22 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 26 05:09:23 localhost nova_compute[281415]: 2025-11-26 10:09:23.533 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:24 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:24 localhost nova_compute[281415]: 2025-11-26 10:09:24.969 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:25 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:09:25 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' 
cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:25 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:25 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:09:27 localhost podman[240049]: time="2025-11-26T10:09:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:09:27 localhost podman[240049]: @ - - [26/Nov/2025:10:09:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:09:27 localhost podman[240049]: @ - - [26/Nov/2025:10:09:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18786 "" "Go-http-client/1.1" Nov 26 05:09:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:09:27.669 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), 
priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:09:27 localhost ovn_metadata_agent[159481]: 2025-11-26 10:09:27.670 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:09:27 localhost nova_compute[281415]: 2025-11-26 10:09:27.706 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:09:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 05:09:27 localhost podman[329347]: 2025-11-26 10:09:27.825373711 +0000 UTC m=+0.082835634 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent) Nov 26 05:09:27 localhost podman[329348]: 2025-11-26 10:09:27.890631919 +0000 UTC 
m=+0.141404494 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:09:27 localhost podman[329347]: 2025-11-26 10:09:27.910554168 +0000 UTC m=+0.168016151 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Nov 26 05:09:27 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:09:27 localhost podman[329348]: 2025-11-26 10:09:27.926163802 +0000 UTC m=+0.176936417 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 26 05:09:27 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:09:28 localhost nova_compute[281415]: 2025-11-26 10:09:28.536 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:28 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:09:28 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:09:28 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:09:28 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 26 05:09:29 localhost sshd[329383]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:09:29 localhost sshd[329384]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:09:29 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:29 localhost nova_compute[281415]: 2025-11-26 10:09:29.971 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:31 localhost ovn_metadata_agent[159481]: 2025-11-26 10:09:31.672 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:09:32 localhost ceph-mon[297296]: from='mgr.34351 
172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:09:32 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:32 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:32 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:09:33 localhost nova_compute[281415]: 2025-11-26 10:09:33.562 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:09:34 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch 
Nov 26 05:09:34 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:35 localhost nova_compute[281415]: 2025-11-26 10:09:35.005 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:35 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 26 05:09:35 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:09:35 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 26 05:09:35 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 26 05:09:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:09:37 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:09:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e260 e260: 6 total, 6 up, 6 in Nov 26 05:09:38 localhost nova_compute[281415]: 2025-11-26 10:09:38.587 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:38 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:09:38 localhost 
ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:38 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:38 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:09:38 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e261 e261: 6 total, 6 up, 6 in Nov 26 05:09:39 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:40 localhost nova_compute[281415]: 2025-11-26 10:09:40.056 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:41 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 
26 05:09:41 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:09:41 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:09:41 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 26 05:09:43 localhost nova_compute[281415]: 2025-11-26 10:09:43.630 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e262 e262: 6 total, 6 up, 6 in Nov 26 05:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:09:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e263 e263: 6 total, 6 up, 6 in Nov 26 05:09:44 localhost systemd[1]: tmp-crun.ytb2Wr.mount: Deactivated successfully. 
Nov 26 05:09:44 localhost podman[329388]: 2025-11-26 10:09:44.861409191 +0000 UTC m=+0.113254530 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 26 05:09:44 localhost podman[329387]: 2025-11-26 10:09:44.887985246 +0000 UTC m=+0.142508518 container health_status 
b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:09:44 localhost podman[329387]: 2025-11-26 10:09:44.894473437 +0000 UTC m=+0.148996689 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) 
Nov 26 05:09:44 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:09:44 localhost podman[329388]: 2025-11-26 10:09:44.941359344 +0000 UTC m=+0.193204663 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 05:09:44 localhost 
systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:09:45 localhost nova_compute[281415]: 2025-11-26 10:09:45.059 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:45 localhost openstack_network_exporter[242153]: ERROR 10:09:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:09:45 localhost openstack_network_exporter[242153]: ERROR 10:09:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:09:45 localhost openstack_network_exporter[242153]: ERROR 10:09:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:09:45 localhost openstack_network_exporter[242153]: ERROR 10:09:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:09:45 localhost openstack_network_exporter[242153]: Nov 26 05:09:45 localhost openstack_network_exporter[242153]: ERROR 10:09:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:09:45 localhost openstack_network_exporter[242153]: Nov 26 05:09:45 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:09:45 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:45 localhost ceph-mon[297296]: 
from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:45 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:09:46 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:09:46 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:09:47 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e264 e264: 6 total, 6 up, 6 in Nov 26 05:09:48 localhost nova_compute[281415]: 2025-11-26 10:09:48.669 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:48 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e265 e265: 6 total, 6 up, 6 in Nov 26 05:09:48 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 26 05:09:48 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:09:48 localhost 
ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 26 05:09:48 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 26 05:09:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:09:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:09:49 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:49 localhost systemd[1]: tmp-crun.jg7m3M.mount: Deactivated successfully. Nov 26 05:09:49 localhost podman[329430]: 2025-11-26 10:09:49.832050175 +0000 UTC m=+0.092236656 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 05:09:49 localhost podman[329431]: 2025-11-26 10:09:49.910465791 +0000 UTC m=+0.165416300 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Nov 26 05:09:49 localhost podman[329431]: 2025-11-26 10:09:49.924465016 +0000 UTC m=+0.179415525 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, distribution-scope=public) Nov 26 05:09:49 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:09:49 localhost podman[329430]: 2025-11-26 10:09:49.941762543 +0000 UTC m=+0.201948994 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 05:09:49 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:09:50 localhost nova_compute[281415]: 2025-11-26 10:09:50.062 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:09:51 localhost podman[329475]: 2025-11-26 10:09:51.823585832 +0000 UTC m=+0.085152777 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 
'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:09:51 localhost podman[329475]: 2025-11-26 10:09:51.831866739 +0000 UTC m=+0.093433674 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:09:51 localhost systemd[1]: 
4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 05:09:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:09:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:09:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:09:53 localhost ceph-osd[32631]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. 
Nov 26 05:09:53 localhost nova_compute[281415]: 2025-11-26 10:09:53.675 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:53 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e266 e266: 6 total, 6 up, 6 in Nov 26 05:09:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:09:55 localhost nova_compute[281415]: 2025-11-26 10:09:55.102 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:57 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:09:57 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:09:57 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:09:57 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 26 05:09:57 localhost podman[240049]: time="2025-11-26T10:09:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:09:57 localhost podman[240049]: @ - - [26/Nov/2025:10:09:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:09:57 localhost podman[240049]: @ - - [26/Nov/2025:10:09:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18791 "" 
"Go-http-client/1.1" Nov 26 05:09:57 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:09:57 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:09:58 localhost nova_compute[281415]: 2025-11-26 10:09:58.676 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:09:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:09:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:09:58 localhost systemd[1]: tmp-crun.WRjulW.mount: Deactivated successfully. Nov 26 05:09:58 localhost podman[329499]: 2025-11-26 10:09:58.807837191 +0000 UTC m=+0.070737968 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:09:58 localhost podman[329499]: 2025-11-26 10:09:58.81970201 +0000 UTC m=+0.082602757 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:09:58 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:09:58 localhost podman[329500]: 2025-11-26 10:09:58.821038572 +0000 UTC m=+0.075947520 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:09:58 localhost podman[329500]: 2025-11-26 10:09:58.906433154 +0000 UTC m=+0.161342062 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:09:58 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:10:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:00 localhost nova_compute[281415]: 2025-11-26 10:10:00.103 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:00 localhost ceph-mon[297296]: overall HEALTH_OK Nov 26 05:10:01 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:10:01 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:01 localhost ceph-mon[297296]: from='mgr.34351 ' 
entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:01 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow r pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.586 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.600 12 DEBUG ceilometer.compute.pollsters [-] 
9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.600 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '571a749d-3527-429e-9f55-f4822534c18e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.587692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'1356ec64-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.829950303, 'message_signature': 'fab85622fcbe022e65ba3455154b90883fff610f1fff34eed046105d549c5130'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.587692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1356ff88-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.829950303, 'message_signature': 'e513473c44004f5c855f25bb5ef021182180a62f42e6a5755d49ce0d91caae09'}]}, 'timestamp': '2025-11-26 10:10:03.601147', '_unique_id': 'e266cc80db8948d8b0ccb02abb2709b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 
05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.603 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle 
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.604 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.614 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0d83767-2671-4766-82e6-a646bc88319f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.605232', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '13591f2a-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': '668b7ae541742136026337f47811086019e59f35cacf31e7fb487535939a689f'}]}, 'timestamp': '2025-11-26 10:10:03.615167', '_unique_id': 'f5fff1a67f114492a82cb960572afc3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.617 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.619 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.649 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.649 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '16518b5c-9b5c-476f-9b43-c80eb9a33e19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.619482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '135e6552-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': 'f29fc5658743ec6c9db108aadda8adc1c3f92c8a7e7a374fba2b018ee462ebf8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.619482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '135e7948-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': '6079262b8a2fe580ad34a751b2d06389e9678798c83bbdd54b92e96b984494c0'}]}, 'timestamp': '2025-11-26 10:10:03.650129', '_unique_id': '75eb5111c8c349a0a720b92689844926'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.651 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.652 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.652 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e0bb0fd1-39d7-4258-8d4f-eba2ac88b931', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.652844', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '135ef7b0-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': '3cd4c8b037f3ffef10d5151a589391cda8958bba4aa311b11a63b66367bdf691'}]}, 'timestamp': '2025-11-26 10:10:03.653345', '_unique_id': '2721152731e24043b86a13c08f257bb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.654 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.655 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.655 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e328e89-bd95-44ac-8eeb-fe0d92b69d3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.655475', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '135f5caa-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': 'd43b23912da309ab3499b637ce441dc18d104c61ffa9787416850ef05caee242'}]}, 'timestamp': '2025-11-26 10:10:03.655960', '_unique_id': '6fd892abcb4047c2be71a60fc4c07f77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.656 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.658 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.658 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12055d2b-33a7-4986-a9ee-6c138b259b6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.658061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '135fc1b8-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.829950303, 'message_signature': '739ddba2d7db991c42a8b545aadbd149e1a4cdc0c756953c5073773dbcff248d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.658061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '135fd1ee-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.829950303, 'message_signature': 'ba97df7d89a17b6ad42cbf12597fdede48ac49ac00050fa028ddfd1303f5d962'}]}, 'timestamp': '2025-11-26 10:10:03.658900', '_unique_id': '3b87a43432b948d4805d8197aa7514b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.659 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.661 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.661 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb6b3ce5-5bfe-4f89-9209-9368360866de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.661067', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '136036fc-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': '563b562385e2d929867b04ef3c7697a026a269fe35b9d3f70a1e0e40de306084'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.661067', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '136047aa-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': 'cf138b7070653d2b36b1cb026ffa1a7230753e3ba5d173709beb06a8a2a345e5'}]}, 'timestamp': '2025-11-26 10:10:03.661914', '_unique_id': '2f3376b864c54aefa9f3dc8d0c2b36a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py",
line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.663 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.664 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3666e877-fde8-483a-8455-de123343a5c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.664048', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1360aba0-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': '7881fde2f93a9df13619e56707348b419789ec460f0ccce34a5d1daea8074a59'}]}, 'timestamp': '2025-11-26 10:10:03.664502', '_unique_id': 'c46f92195c5643d0b2abe71b6a7c4ce9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.666 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.666 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.667 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d7e9902-2fd8-4a6a-b8d5-2c36aea5df2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.666588', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13610ea6-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': '2f5614602d387aaf67cead4d5b1b96bb9e6fe571cc9b796a5adb3190486354cd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.666588', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13612026-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': '8e80d5cc8bcf0e94980bc98011c895cb4530dc6da6d41d96a21e0ede96784901'}]}, 'timestamp': '2025-11-26 10:10:03.667457', '_unique_id': 'c980847c66d34dc8bdd284f19c9ae087'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.668 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.669 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.669 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.670 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.670 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fdc6b887-fb13-4bed-b031-937b5be76f5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.670055', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1361963c-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': '927f550a8d635d4fe19d21d8b35af30715ad2324f782b66f2afe3bd0b1fa48cc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.670055', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1361a618-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': 'd6bb888a680f5d73a3ceedeb53c9b187f956f12ef99e475710225d4578ca7d38'}]}, 'timestamp': '2025-11-26 10:10:03.670887', '_unique_id': 'f7a0b0a9056d47a3be964022bcc8009f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.671 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.672 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.673 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2d1d6cf1-50f4-4f8b-9d7d-b499177f64f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.673014', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '136209d2-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': 'bacbcd5ff4de6f56f8b020e6cdec4deb5d82211036526702d38f08437c45e08c'}]}, 'timestamp': '2025-11-26 10:10:03.673487', '_unique_id': '119a7c26f23e45998bbe44fe923fd1c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.674 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.675 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '670a3893-8e27-44c2-a65a-27d5078fc488', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7557, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.675665', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '136273ea-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': '613b8dc5f248fb2a3bc5802514f46f95acba1a8d338120cf22b56d61813088fb'}]}, 'timestamp': '2025-11-26 10:10:03.676189', '_unique_id': '3a449be2e5cd4ac3a548a4ea1ad195c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.677 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.678 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:03.678 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 26 05:10:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:03.679 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a34cb19f-59e7-4dc3-880e-ef6fcc69cb6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.678257', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1362d678-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': 'adda042e54b8ab9325dd4e45fe43f9159f6cce497a82be80b355e48dd84ed0b9'}]}, 'timestamp': '2025-11-26 10:10:03.678718', '_unique_id': 'febadfb0b1784e6cb2b0f281628d4c5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.679 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.680 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.680 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:03.680 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 26 05:10:03 localhost nova_compute[281415]: 2025-11-26 10:10:03.681 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54a080a5-74b5-4bd7-9149-e46523f5835c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.680870', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '13633f6e-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': '29bd91f3ec1f541d0d83e5e7fe559ecccea98aed4d12e79835a8a6a54fad32ff'}]}, 'timestamp': '2025-11-26 10:10:03.681415', '_unique_id': 'e35b8d2269c24caf9a82e72b5acc44b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.682 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.683 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.701 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 19110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '708fbf06-6831-4067-ac30-37c86375a0c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19110000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:10:03.683750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1366650e-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.943676016, 'message_signature': 'a2b514c2d6bb62d7e35e8f8ed144d523d61911c51235ef2179567d0e10fbd47c'}]}, 'timestamp': '2025-11-26 10:10:03.702056', '_unique_id': '604c3191582a463d90bd98da302655e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.703 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.704 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.704 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6dec2722-c179-4a24-b1a4-62f29af1d87c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.704173', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1366cae4-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': 'ccea80845e07a3424c78e52ed1ec5ace4afa8ca33adcc12acdd1101070ac92b0'}]}, 'timestamp': '2025-11-26 10:10:03.704623', '_unique_id': '8af6693a2c5a4833a6f8e293758eeffb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.705 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.706 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.706 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbf1cd6a-b4ff-4c05-850d-972fac5ac2a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:10:03.706833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '136733bc-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.943676016, 
'message_signature': 'cdf940371137fe8601e8a06002c52e8098639c7781858c5b043740b06d7dbb01'}]}, 'timestamp': '2025-11-26 10:10:03.707292', '_unique_id': 'c9dce24cdcec44b8a12cdc5cbff7956a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.708 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.709 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.709 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.709 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9175590e-5b4d-4210-8e22-ac268c295693', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.709341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13679582-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': '0637102b079be123efe18298b6147e0cdb8e29a51ed5098468fc968ab83b5fc3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.709341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1367a6a8-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': '47c216de39d14dd9f1b8e49eeedd5235ac0b91dd17420bde1b5835e7f7b0ab37'}]}, 'timestamp': '2025-11-26 10:10:03.710222', '_unique_id': '50b98e153e5a46ccb012317385fc309a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.711 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.712 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.712 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.713 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad78f654-bad4-49f1-b04a-6652cc4f86b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.712614', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '13681476-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': '6d3d53a665e696c032c1a1624baf0571bd030842b33997672e5e6365e34a7ec2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.712614', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1368259c-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.861781272, 'message_signature': 'dd9810168d38c5be95f8ab7749eeed2828853a59959857cb3b2b8e01bd5bb56e'}]}, 'timestamp': '2025-11-26 10:10:03.713493', '_unique_id': '0fc4e77d5d14450abe49282efc1e1b07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.714 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.715 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.715 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.716 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '551b7e2c-0aa0-469d-b683-fd89c931baec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:10:03.715589', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '136888b6-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.829950303, 'message_signature': 'a0163d7e8fbf1a63767a4580f2ec372e734d0f7ff04d2ada1acefbbd3070a15b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:10:03.715589', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '13689db0-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.829950303, 'message_signature': '9580db134299a8098ad06f9b5092380e0a5d47c337a363371584d78d44547c19'}]}, 'timestamp': '2025-11-26 10:10:03.716548', '_unique_id': 'eee02eefab114595bdc2328052943d86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging
self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.717 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.718 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.718 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 68 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52c796e8-37ec-456a-99ef-dbf1c782bd47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 68, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:10:03.718651', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '1369076e-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12278.847486847, 'message_signature': 'ed27688181332a85ad3e768e2469d327c557ff2cccd1265ae4e16dd88d8cf7e2'}]}, 'timestamp': '2025-11-26 10:10:03.719332', '_unique_id': '747832c06fe6440b8d70e96a80304947'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR 
oslo_messaging.notify.messaging Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:10:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:10:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:10:03.720 12 ERROR oslo_messaging.notify.messaging Nov 26 05:10:04 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 26 05:10:04 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:10:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 26 05:10:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 26 05:10:05 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:05 localhost nova_compute[281415]: 2025-11-26 10:10:05.107 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:06 localhost sshd[329535]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:10:06 localhost nova_compute[281415]: 2025-11-26 10:10:06.903 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:07 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 26 05:10:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e267 e267: 6 total, 6 up, 6 in Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 
10:10:08.684 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 10:10:08.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 10:10:08.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 10:10:08.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:08 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:08 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"} : dispatch Nov 26 
05:10:08 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0. Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:08.883713) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55 Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151808883822, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2810, "num_deletes": 261, "total_data_size": 4180761, "memory_usage": 4337728, "flush_reason": "Manual Compaction"} Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 10:10:08.888 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 10:10:08.889 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock 
"compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 10:10:08.889 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 10:10:08.889 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:10:08 localhost nova_compute[281415]: 2025-11-26 10:10:08.890 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151808903255, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2728933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31856, "largest_seqno": 34661, "table_properties": {"data_size": 2717878, "index_size": 6856, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 27827, "raw_average_key_size": 22, "raw_value_size": 2694030, "raw_average_value_size": 2163, "num_data_blocks": 294, "num_entries": 1245, 
"num_filter_entries": 1245, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151684, "oldest_key_time": 1764151684, "file_creation_time": 1764151808, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}} Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 19585 microseconds, and 8023 cpu microseconds. Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:08.903312) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2728933 bytes OK Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:08.903342) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:08.905338) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:08.905371) EVENT_LOG_v1 {"time_micros": 1764151808905365, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:08.905397) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 4167298, prev total WAL file size 4167298, number of live WAL files 2. 
Nov 26 05:10:08 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:10:08 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:08.906547) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end) Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2664KB)], [54(15MB)] Nov 26 05:10:08 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151808906604, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 19486690, "oldest_snapshot_seqno": -1} Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 13995 keys, 17911645 bytes, temperature: kUnknown Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151809000588, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 17911645, "file_checksum": "", "file_checksum_func_name": "Unknown", 
"smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17829677, "index_size": 45899, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35013, "raw_key_size": 373038, "raw_average_key_size": 26, "raw_value_size": 17589899, "raw_average_value_size": 1256, "num_data_blocks": 1729, "num_entries": 13995, "num_filter_entries": 13995, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151808, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}} Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:09.000964) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 17911645 bytes Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:09.003193) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 207.0 rd, 190.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 16.0 +0.0 blob) out(17.1 +0.0 blob), read-write-amplify(13.7) write-amplify(6.6) OK, records in: 14537, records dropped: 542 output_compression: NoCompression Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:09.003224) EVENT_LOG_v1 {"time_micros": 1764151809003210, "job": 32, "event": "compaction_finished", "compaction_time_micros": 94117, "compaction_time_cpu_micros": 52292, "output_level": 6, "num_output_files": 1, "total_output_size": 17911645, "num_input_records": 14537, "num_output_records": 13995, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151809003827, "job": 32, "event": "table_file_deletion", "file_number": 56} Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151809006402, 
"job": 32, "event": "table_file_deletion", "file_number": 54} Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:08.906439) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:09.006543) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:09.006554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:09.006558) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:09.006561) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:09 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:09.006565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:10:09 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1385155121' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.336 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.411 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.412 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.631 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.633 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11130MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", 
"product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.634 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.634 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.735 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.736 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.736 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:10:09 localhost nova_compute[281415]: 2025-11-26 10:10:09.776 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:10:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:10 localhost nova_compute[281415]: 2025-11-26 10:10:10.110 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:10:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1080173198' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:10:10 localhost nova_compute[281415]: 2025-11-26 10:10:10.250 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:10:10 localhost nova_compute[281415]: 2025-11-26 10:10:10.258 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:10:10 localhost nova_compute[281415]: 2025-11-26 10:10:10.279 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:10:10 localhost nova_compute[281415]: 2025-11-26 10:10:10.282 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:10:10 localhost nova_compute[281415]: 2025-11-26 10:10:10.282 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:10:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:11 localhost nova_compute[281415]: 2025-11-26 10:10:11.278 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:11 localhost nova_compute[281415]: 2025-11-26 10:10:11.315 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:11 localhost nova_compute[281415]: 2025-11-26 10:10:11.316 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:11 localhost nova_compute[281415]: 2025-11-26 10:10:11.317 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:11 localhost 
nova_compute[281415]: 2025-11-26 10:10:11.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:11 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:10:11 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:10:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:12 localhost nova_compute[281415]: 2025-11-26 10:10:12.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:13 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:13 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:13 localhost nova_compute[281415]: 2025-11-26 10:10:13.708 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:13 localhost nova_compute[281415]: 2025-11-26 10:10:13.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running 
periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:10:13 localhost nova_compute[281415]: 2025-11-26 10:10:13.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:10:13 localhost nova_compute[281415]: 2025-11-26 10:10:13.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:10:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e268 e268: 6 total, 6 up, 6 in Nov 26 05:10:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:10:14 localhost nova_compute[281415]: 2025-11-26 10:10:14.939 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:10:14 localhost nova_compute[281415]: 2025-11-26 10:10:14.940 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:10:14 localhost nova_compute[281415]: 2025-11-26 10:10:14.940 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:10:14 localhost nova_compute[281415]: 
2025-11-26 10:10:14.941 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:10:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:15 localhost nova_compute[281415]: 2025-11-26 10:10:15.145 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:15 localhost nova_compute[281415]: 2025-11-26 10:10:15.548 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:10:15 localhost nova_compute[281415]: 2025-11-26 10:10:15.566 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:10:15 localhost nova_compute[281415]: 2025-11-26 10:10:15.567 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:10:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:10:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:10:15 localhost openstack_network_exporter[242153]: ERROR 10:10:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:10:15 localhost openstack_network_exporter[242153]: ERROR 10:10:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:10:15 localhost openstack_network_exporter[242153]: ERROR 10:10:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:10:15 localhost openstack_network_exporter[242153]: ERROR 10:10:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:10:15 localhost openstack_network_exporter[242153]: Nov 26 05:10:15 localhost openstack_network_exporter[242153]: ERROR 10:10:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:10:15 localhost openstack_network_exporter[242153]: Nov 26 05:10:15 localhost podman[329667]: 2025-11-26 10:10:15.915613267 +0000 UTC m=+0.169415604 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=podman_exporter) Nov 26 05:10:15 localhost podman[329668]: 2025-11-26 10:10:15.877228875 +0000 UTC m=+0.130870527 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 05:10:15 localhost podman[329668]: 2025-11-26 10:10:15.957121937 +0000 UTC 
m=+0.210763559 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Nov 26 05:10:15 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 05:10:15 localhost podman[329667]: 2025-11-26 10:10:15.981102322 +0000 UTC m=+0.234904649 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:10:16 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 05:10:16 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 26 05:10:16 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a,allow rw path=/volumes/_nogroup/975dc2c2-9afa-478b-bb89-47ce2cdbbc97/3fda5e98-80ce-48cc-b39a-9ceab2a53e06", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656,allow rw pool=manila_data namespace=fsvolumens_975dc2c2-9afa-478b-bb89-47ce2cdbbc97"]} : dispatch Nov 26 05:10:16 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a,allow rw path=/volumes/_nogroup/975dc2c2-9afa-478b-bb89-47ce2cdbbc97/3fda5e98-80ce-48cc-b39a-9ceab2a53e06", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656,allow rw pool=manila_data namespace=fsvolumens_975dc2c2-9afa-478b-bb89-47ce2cdbbc97"]} : dispatch Nov 26 05:10:16 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 26 05:10:16 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a,allow rw path=/volumes/_nogroup/975dc2c2-9afa-478b-bb89-47ce2cdbbc97/3fda5e98-80ce-48cc-b39a-9ceab2a53e06", 
"osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656,allow rw pool=manila_data namespace=fsvolumens_975dc2c2-9afa-478b-bb89-47ce2cdbbc97"]}]': finished Nov 26 05:10:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e269 e269: 6 total, 6 up, 6 in Nov 26 05:10:18 localhost nova_compute[281415]: 2025-11-26 10:10:18.741 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:19 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 26 05:10:19 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656"]} : dispatch Nov 26 05:10:19 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656"]} : dispatch Nov 26 05:10:19 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/51d262c4-3a43-4bfc-acac-7f787495e656/3d500796-247d-4981-86d0-b78b9fa2895a", "osd", "allow rw pool=manila_data namespace=fsvolumens_51d262c4-3a43-4bfc-acac-7f787495e656"]}]': finished Nov 26 05:10:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 
343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:20 localhost nova_compute[281415]: 2025-11-26 10:10:20.148 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:10:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:10:20 localhost podman[329711]: 2025-11-26 10:10:20.830305485 +0000 UTC m=+0.083567387 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:10:20 localhost systemd[1]: tmp-crun.JXZVeq.mount: Deactivated successfully. Nov 26 05:10:20 localhost podman[329712]: 2025-11-26 10:10:20.889117562 +0000 UTC m=+0.139313789 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., 
architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc.) Nov 26 05:10:20 localhost podman[329711]: 2025-11-26 10:10:20.895351336 +0000 UTC m=+0.148613188 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 26 05:10:20 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:10:20 localhost podman[329712]: 2025-11-26 10:10:20.929456836 +0000 UTC m=+0.179653063 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, 
managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9) Nov 26 05:10:20 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:10:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:10:22 localhost systemd[1]: tmp-crun.WYdwb4.mount: Deactivated successfully. 
Nov 26 05:10:22 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 26 05:10:22 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Nov 26 05:10:22 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Nov 26 05:10:22 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Nov 26 05:10:22 localhost podman[329756]: 2025-11-26 10:10:22.836104357 +0000 UTC m=+0.092807034 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 
'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 05:10:22 localhost podman[329756]: 2025-11-26 10:10:22.868331268 +0000 UTC m=+0.125033945 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:10:22 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:10:23 localhost nova_compute[281415]: 2025-11-26 10:10:23.782 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e270 e270: 6 total, 6 up, 6 in Nov 26 05:10:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:25 localhost nova_compute[281415]: 2025-11-26 10:10:25.190 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:27 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:27 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:27 localhost podman[240049]: time="2025-11-26T10:10:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:10:27 localhost podman[240049]: @ - - [26/Nov/2025:10:10:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:10:27 localhost podman[240049]: @ - - [26/Nov/2025:10:10:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18791 "" "Go-http-client/1.1" Nov 26 05:10:28 localhost nova_compute[281415]: 2025-11-26 10:10:28.826 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 05:10:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:10:29 localhost systemd[1]: tmp-crun.JRKpUl.mount: Deactivated successfully. Nov 26 05:10:29 localhost podman[329780]: 2025-11-26 10:10:29.83876531 +0000 UTC m=+0.094039823 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true) Nov 26 05:10:29 localhost podman[329780]: 2025-11-26 10:10:29.872709415 +0000 UTC m=+0.127983988 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:10:29 localhost systemd[1]: tmp-crun.yrp1j5.mount: Deactivated successfully. Nov 26 05:10:29 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:10:29 localhost podman[329781]: 2025-11-26 10:10:29.892064736 +0000 UTC m=+0.142970802 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd) Nov 26 05:10:29 localhost podman[329781]: 2025-11-26 10:10:29.907280108 +0000 UTC m=+0.158186184 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:10:29 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:10:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:30 localhost nova_compute[281415]: 2025-11-26 10:10:30.192 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:31 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:31 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0. 
Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.816828) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58 Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151833816897, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 682, "num_deletes": 257, "total_data_size": 656329, "memory_usage": 669584, "flush_reason": "Manual Compaction"} Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151833823402, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 430152, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34667, "largest_seqno": 35343, "table_properties": {"data_size": 426661, "index_size": 1282, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8975, "raw_average_key_size": 20, "raw_value_size": 419162, "raw_average_value_size": 935, "num_data_blocks": 56, "num_entries": 448, "num_filter_entries": 448, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151809, "oldest_key_time": 1764151809, "file_creation_time": 1764151833, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}} Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 6608 microseconds, and 2323 cpu microseconds. Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.823445) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 430152 bytes OK Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.823469) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.825511) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.825533) EVENT_LOG_v1 {"time_micros": 1764151833825526, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.825557) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 652366, prev total WAL file size 
652690, number of live WAL files 2. Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.826456) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323638' seq:72057594037927935, type:22 .. '6C6F676D0034353230' seq:0, type:0; will stop at (end) Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(420KB)], [57(17MB)] Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151833826552, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18341797, "oldest_snapshot_seqno": -1} Nov 26 05:10:33 localhost nova_compute[281415]: 2025-11-26 10:10:33.829 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13907 keys, 18071784 bytes, temperature: kUnknown Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151833898629, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 18071784, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17989833, "index_size": 46074, 
"index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34821, "raw_key_size": 372581, "raw_average_key_size": 26, "raw_value_size": 17751040, "raw_average_value_size": 1276, "num_data_blocks": 1731, "num_entries": 13907, "num_filter_entries": 13907, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151833, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}} Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.899037) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 18071784 bytes Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.901204) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 254.2 rd, 250.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 17.1 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(84.7) write-amplify(42.0) OK, records in: 14443, records dropped: 536 output_compression: NoCompression Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.901234) EVENT_LOG_v1 {"time_micros": 1764151833901220, "job": 34, "event": "compaction_finished", "compaction_time_micros": 72146, "compaction_time_cpu_micros": 37155, "output_level": 6, "num_output_files": 1, "total_output_size": 18071784, "num_input_records": 14443, "num_output_records": 13907, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151833901451, "job": 34, "event": "table_file_deletion", "file_number": 59} Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151833903919, 
"job": 34, "event": "table_file_deletion", "file_number": 57} Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.826297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.903971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.903975) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.903978) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.903982) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:33 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:10:33.903985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:10:34 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:34.074 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:10:34 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:34.075 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying 
updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:10:34 localhost nova_compute[281415]: 2025-11-26 10:10:34.112 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:35 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:35 localhost nova_compute[281415]: 2025-11-26 10:10:35.232 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:36 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:36 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:37 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:37.078 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:10:37 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e271 e271: 6 total, 6 up, 6 in Nov 26 05:10:38 localhost nova_compute[281415]: 2025-11-26 10:10:38.833 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 
kv_alloc: 318767104 Nov 26 05:10:40 localhost nova_compute[281415]: 2025-11-26 10:10:40.236 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:40 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:42 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:42 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:43 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e272 e272: 6 total, 6 up, 6 in Nov 26 05:10:43 localhost nova_compute[281415]: 2025-11-26 10:10:43.867 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:44 localhost sshd[329820]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:10:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:44 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:44 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e273 e273: 6 total, 6 up, 6 in Nov 26 05:10:45 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:45 localhost 
nova_compute[281415]: 2025-11-26 10:10:45.271 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:45 localhost openstack_network_exporter[242153]: ERROR 10:10:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:10:45 localhost openstack_network_exporter[242153]: ERROR 10:10:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:10:45 localhost openstack_network_exporter[242153]: ERROR 10:10:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:10:45 localhost openstack_network_exporter[242153]: ERROR 10:10:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:10:45 localhost openstack_network_exporter[242153]: Nov 26 05:10:45 localhost openstack_network_exporter[242153]: ERROR 10:10:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:10:45 localhost openstack_network_exporter[242153]: Nov 26 05:10:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:10:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:10:46 localhost podman[329822]: 2025-11-26 10:10:46.829411519 +0000 UTC m=+0.087616762 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:10:46 localhost podman[329822]: 2025-11-26 10:10:46.841402862 +0000 UTC m=+0.099608065 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:10:46 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:10:46 localhost podman[329823]: 2025-11-26 10:10:46.933658508 +0000 UTC m=+0.188277640 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base 
Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 26 05:10:46 localhost podman[329823]: 2025-11-26 10:10:46.947458947 +0000 UTC m=+0.202078129 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:10:46 
localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:10:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 26 05:10:47 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c4fcdbcf-5eaa-402c-9db8-42ab32b91d6d/b870cfa3-5ed0-493f-a464-eee3f8d4ad30", "osd", "allow rw pool=manila_data namespace=fsvolumens_c4fcdbcf-5eaa-402c-9db8-42ab32b91d6d", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c4fcdbcf-5eaa-402c-9db8-42ab32b91d6d/b870cfa3-5ed0-493f-a464-eee3f8d4ad30", "osd", "allow rw pool=manila_data namespace=fsvolumens_c4fcdbcf-5eaa-402c-9db8-42ab32b91d6d", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:47 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c4fcdbcf-5eaa-402c-9db8-42ab32b91d6d/b870cfa3-5ed0-493f-a464-eee3f8d4ad30", "osd", "allow rw pool=manila_data namespace=fsvolumens_c4fcdbcf-5eaa-402c-9db8-42ab32b91d6d", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:10:48 localhost ceph-mgr[287388]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3354046426 Nov 26 05:10:48 localhost nova_compute[281415]: 2025-11-26 10:10:48.869 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:49 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:49 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:50 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:50 localhost nova_compute[281415]: 2025-11-26 10:10:50.274 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:51 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:10:51 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:10:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:10:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:10:51 localhost systemd[1]: tmp-crun.QswEf3.mount: Deactivated successfully. 
Nov 26 05:10:51 localhost podman[329863]: 2025-11-26 10:10:51.819275881 +0000 UTC m=+0.077532889 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller) Nov 26 05:10:51 localhost podman[329863]: 2025-11-26 10:10:51.90483099 +0000 UTC m=+0.163088088 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 26 05:10:51 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:10:51 localhost podman[329864]: 2025-11-26 10:10:51.909123903 +0000 UTC m=+0.164431799 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
config_id=edpm, distribution-scope=public) Nov 26 05:10:51 localhost podman[329864]: 2025-11-26 10:10:51.988707306 +0000 UTC m=+0.244015272 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6) Nov 26 05:10:52 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:10:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-334336383", "format": "json"} : dispatch Nov 26 05:10:52 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-334336383", "caps": ["mds", "allow rw path=/volumes/_nogroup/8a1484c2-5dd2-4500-94a7-be59ed8c0eb6/8cb91c51-ba36-4957-8e8b-e461fd2a70f5", "osd", "allow rw pool=manila_data namespace=fsvolumens_8a1484c2-5dd2-4500-94a7-be59ed8c0eb6", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-334336383", "caps": ["mds", "allow rw path=/volumes/_nogroup/8a1484c2-5dd2-4500-94a7-be59ed8c0eb6/8cb91c51-ba36-4957-8e8b-e461fd2a70f5", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_8a1484c2-5dd2-4500-94a7-be59ed8c0eb6", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:52 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-334336383", "caps": ["mds", "allow rw path=/volumes/_nogroup/8a1484c2-5dd2-4500-94a7-be59ed8c0eb6/8cb91c51-ba36-4957-8e8b-e461fd2a70f5", "osd", "allow rw pool=manila_data namespace=fsvolumens_8a1484c2-5dd2-4500-94a7-be59ed8c0eb6", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:10:53 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e274 e274: 6 total, 6 up, 6 in Nov 26 05:10:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:10:53 localhost systemd[1]: tmp-crun.aKAaFR.mount: Deactivated successfully. Nov 26 05:10:53 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e275 e275: 6 total, 6 up, 6 in Nov 26 05:10:53 localhost podman[329904]: 2025-11-26 10:10:53.841282148 +0000 UTC m=+0.099087949 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:10:53 localhost podman[329904]: 2025-11-26 10:10:53.855488539 +0000 UTC m=+0.113294350 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus 
Authors , managed_by=edpm_ansible) Nov 26 05:10:53 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 05:10:53 localhost nova_compute[281415]: 2025-11-26 10:10:53.872 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:53 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-334336383", "format": "json"} : dispatch Nov 26 05:10:53 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-334336383"} : dispatch Nov 26 05:10:53 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-334336383"} : dispatch Nov 26 05:10:53 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-334336383"}]': finished Nov 26 05:10:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:10:55 localhost nova_compute[281415]: 2025-11-26 10:10:55.566 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:55 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 26 05:10:57 localhost podman[240049]: time="2025-11-26T10:10:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:10:57 localhost podman[240049]: @ - - [26/Nov/2025:10:10:57 +0000] 
"GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:10:57 localhost podman[240049]: @ - - [26/Nov/2025:10:10:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18791 "" "Go-http-client/1.1" Nov 26 05:10:58 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1392729429", "format": "json"} : dispatch Nov 26 05:10:58 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1392729429", "caps": ["mds", "allow rw path=/volumes/_nogroup/26c7be4e-509a-40d8-81be-8160cc112546/6f4d674e-af3b-465e-9b34-4211dfb5a818", "osd", "allow rw pool=manila_data namespace=fsvolumens_26c7be4e-509a-40d8-81be-8160cc112546", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:58 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1392729429", "caps": ["mds", "allow rw path=/volumes/_nogroup/26c7be4e-509a-40d8-81be-8160cc112546/6f4d674e-af3b-465e-9b34-4211dfb5a818", "osd", "allow rw pool=manila_data namespace=fsvolumens_26c7be4e-509a-40d8-81be-8160cc112546", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:10:58 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1392729429", "caps": ["mds", "allow rw path=/volumes/_nogroup/26c7be4e-509a-40d8-81be-8160cc112546/6f4d674e-af3b-465e-9b34-4211dfb5a818", "osd", "allow rw pool=manila_data namespace=fsvolumens_26c7be4e-509a-40d8-81be-8160cc112546", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:10:58 localhost ceph-mon[297296]: 
mon.np0005536118@1(peon).osd e276 e276: 6 total, 6 up, 6 in Nov 26 05:10:58 localhost nova_compute[281415]: 2025-11-26 10:10:58.874 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:10:59.255 262471 INFO neutron.agent.linux.ip_lib [None req-5c21b422-9581-46ea-a40b-cfdee02e6c91 - - - - - -] Device tap66ba12d6-97 cannot be used as it has no MAC address#033[00m Nov 26 05:10:59 localhost nova_compute[281415]: 2025-11-26 10:10:59.318 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost kernel: device tap66ba12d6-97 entered promiscuous mode Nov 26 05:10:59 localhost NetworkManager[5970]: [1764151859.3300] manager: (tap66ba12d6-97): new Generic device (/org/freedesktop/NetworkManager/Devices/79) Nov 26 05:10:59 localhost nova_compute[281415]: 2025-11-26 10:10:59.331 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost ovn_controller[153664]: 2025-11-26T10:10:59Z|00515|binding|INFO|Claiming lport 66ba12d6-97e8-4734-ac36-52496d50a518 for this chassis. 
Nov 26 05:10:59 localhost ovn_controller[153664]: 2025-11-26T10:10:59Z|00516|binding|INFO|66ba12d6-97e8-4734-ac36-52496d50a518: Claiming unknown Nov 26 05:10:59 localhost nova_compute[281415]: 2025-11-26 10:10:59.345 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:59.343 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97357935186e4539b78bb721e122577a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62fbbb20-962c-4fb3-aa5f-b185a37d1444, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=66ba12d6-97e8-4734-ac36-52496d50a518) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:10:59 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:59.345 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 66ba12d6-97e8-4734-ac36-52496d50a518 in datapath 
a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7 bound to our chassis#033[00m Nov 26 05:10:59 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:59.347 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port a3997c20-54b1-4c5c-9d29-712361bb5331 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:10:59 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:59.347 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:10:59 localhost ovn_metadata_agent[159481]: 2025-11-26 10:10:59.348 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e8e28a-4293-4c8b-a279-685590d6d56d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:10:59 localhost ovn_controller[153664]: 2025-11-26T10:10:59Z|00517|binding|INFO|Setting lport 66ba12d6-97e8-4734-ac36-52496d50a518 ovn-installed in OVS Nov 26 05:10:59 localhost ovn_controller[153664]: 2025-11-26T10:10:59Z|00518|binding|INFO|Setting lport 66ba12d6-97e8-4734-ac36-52496d50a518 up in Southbound Nov 26 05:10:59 localhost nova_compute[281415]: 2025-11-26 10:10:59.372 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost nova_compute[281415]: 2025-11-26 10:10:59.373 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost nova_compute[281415]: 2025-11-26 10:10:59.425 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost 
nova_compute[281415]: 2025-11-26 10:10:59.455 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost nova_compute[281415]: 2025-11-26 10:10:59.590 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:10:59 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e277 e277: 6 total, 6 up, 6 in Nov 26 05:11:00 localhost podman[329990]: Nov 26 05:11:00 localhost podman[329990]: 2025-11-26 10:11:00.47597981 +0000 UTC m=+0.119208234 container create 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:11:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:11:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:11:00 localhost podman[329990]: 2025-11-26 10:11:00.410267609 +0000 UTC m=+0.053496043 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:11:00 localhost systemd[1]: Started libpod-conmon-04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982.scope. Nov 26 05:11:00 localhost systemd[1]: tmp-crun.QlnF45.mount: Deactivated successfully. 
Nov 26 05:11:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:00 localhost systemd[1]: Started libcrun container. Nov 26 05:11:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acaa3199f2d9b4bfd0f6f8f5b1b91ce43b125639acae88b1f36d6142abf0e1d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:11:00 localhost nova_compute[281415]: 2025-11-26 10:11:00.612 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:00 localhost podman[330003]: 2025-11-26 10:11:00.642590207 +0000 UTC m=+0.134250632 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:11:00 localhost podman[330003]: 2025-11-26 10:11:00.654220838 +0000 UTC m=+0.145881263 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:11:00 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:11:00 localhost podman[329990]: 2025-11-26 10:11:00.67329307 +0000 UTC m=+0.316521484 container init 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 26 05:11:00 localhost podman[330004]: 2025-11-26 10:11:00.627333262 +0000 UTC m=+0.117214472 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3) Nov 26 05:11:00 localhost dnsmasq[330042]: started, version 2.85 cachesize 150 Nov 26 05:11:00 localhost dnsmasq[330042]: DNS service limited to local subnets Nov 26 05:11:00 localhost dnsmasq[330042]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:11:00 localhost dnsmasq[330042]: warning: no upstream servers configured Nov 26 05:11:00 localhost dnsmasq-dhcp[330042]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:11:00 localhost dnsmasq[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/addn_hosts - 0 addresses Nov 26 05:11:00 localhost dnsmasq-dhcp[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/host Nov 26 05:11:00 localhost 
dnsmasq-dhcp[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/opts Nov 26 05:11:00 localhost podman[330004]: 2025-11-26 10:11:00.70839168 +0000 UTC m=+0.198272870 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 05:11:00 localhost systemd[1]: 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:11:00 localhost podman[329990]: 2025-11-26 10:11:00.733787699 +0000 UTC m=+0.377016103 container start 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:11:00 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:11:00.797 262471 INFO neutron.agent.dhcp.agent [None req-7004f8a0-2d43-40ce-b2e7-c7434d5d581a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:10:59Z, description=, device_id=3ecb89c1-d55d-4a57-953b-05ba1a9134bd, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=103b7a7b-ce26-48af-ab9c-9d6d820938e4, ip_allocation=immediate, mac_address=fa:16:3e:f6:ca:d1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:10:56Z, description=, dns_domain=, id=a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-360968376-network, port_security_enabled=True, project_id=97357935186e4539b78bb721e122577a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52889, qos_policy_id=None, revision_number=2, router:external=False, 
shared=False, standard_attr_id=3772, status=ACTIVE, subnets=['bd5d6184-9aaf-4cd0-b4c8-21f1c899fc9d'], tags=[], tenant_id=97357935186e4539b78bb721e122577a, updated_at=2025-11-26T10:10:57Z, vlan_transparent=None, network_id=a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, port_security_enabled=False, project_id=97357935186e4539b78bb721e122577a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3795, status=DOWN, tags=[], tenant_id=97357935186e4539b78bb721e122577a, updated_at=2025-11-26T10:10:59Z on network a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7#033[00m Nov 26 05:11:00 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:11:00.947 262471 INFO neutron.agent.dhcp.agent [None req-83b25090-036a-4128-bea9-8f17c97e580c - - - - - -] DHCP configuration for ports {'3a066c3a-fa27-4b34-8072-054b38a1cdf0'} is completed#033[00m Nov 26 05:11:01 localhost dnsmasq[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/addn_hosts - 1 addresses Nov 26 05:11:01 localhost podman[330060]: 2025-11-26 10:11:01.015868142 +0000 UTC m=+0.054675769 container kill 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:11:01 localhost dnsmasq-dhcp[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/host Nov 26 05:11:01 localhost dnsmasq-dhcp[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/opts Nov 26 05:11:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:11:01.150 262471 
INFO neutron.agent.dhcp.agent [None req-49fbf61a-8e21-4fff-9893-8d4907dcd115 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:10:59Z, description=, device_id=3ecb89c1-d55d-4a57-953b-05ba1a9134bd, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=103b7a7b-ce26-48af-ab9c-9d6d820938e4, ip_allocation=immediate, mac_address=fa:16:3e:f6:ca:d1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:10:56Z, description=, dns_domain=, id=a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-360968376-network, port_security_enabled=True, project_id=97357935186e4539b78bb721e122577a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52889, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3772, status=ACTIVE, subnets=['bd5d6184-9aaf-4cd0-b4c8-21f1c899fc9d'], tags=[], tenant_id=97357935186e4539b78bb721e122577a, updated_at=2025-11-26T10:10:57Z, vlan_transparent=None, network_id=a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, port_security_enabled=False, project_id=97357935186e4539b78bb721e122577a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3795, status=DOWN, tags=[], tenant_id=97357935186e4539b78bb721e122577a, updated_at=2025-11-26T10:10:59Z on network a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7#033[00m Nov 26 05:11:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:11:01.210 262471 INFO neutron.agent.dhcp.agent [None req-7430d299-974d-4fea-a17f-692d15318c08 - - - - - -] DHCP configuration for ports 
{'103b7a7b-ce26-48af-ab9c-9d6d820938e4'} is completed#033[00m Nov 26 05:11:01 localhost dnsmasq[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/addn_hosts - 1 addresses Nov 26 05:11:01 localhost dnsmasq-dhcp[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/host Nov 26 05:11:01 localhost dnsmasq-dhcp[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/opts Nov 26 05:11:01 localhost podman[330098]: 2025-11-26 10:11:01.385127094 +0000 UTC m=+0.065129084 container kill 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:11:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:11:01.927 262471 INFO neutron.agent.dhcp.agent [None req-26c6dc96-0dc4-42c7-8f55-740b76902dea - - - - - -] DHCP configuration for ports {'103b7a7b-ce26-48af-ab9c-9d6d820938e4'} is completed#033[00m Nov 26 05:11:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:03.680 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:11:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:03.681 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:11:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:03.681 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:11:03 localhost nova_compute[281415]: 2025-11-26 10:11:03.908 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:04 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1392729429", "format": "json"} : dispatch Nov 26 05:11:04 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1392729429"} : dispatch Nov 26 05:11:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1392729429"} : dispatch Nov 26 05:11:04 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1392729429"}]': finished Nov 26 05:11:05 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:05 localhost nova_compute[281415]: 2025-11-26 10:11:05.616 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:05 localhost podman[330136]: 2025-11-26 10:11:05.955761604 +0000 UTC m=+0.062104420 container kill 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:11:05 localhost dnsmasq[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/addn_hosts - 0 addresses Nov 26 05:11:05 localhost dnsmasq-dhcp[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/host Nov 26 05:11:05 localhost dnsmasq-dhcp[330042]: read /var/lib/neutron/dhcp/a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7/opts Nov 26 05:11:06 localhost ovn_controller[153664]: 2025-11-26T10:11:06Z|00519|binding|INFO|Releasing lport 66ba12d6-97e8-4734-ac36-52496d50a518 from this chassis (sb_readonly=0) Nov 26 05:11:06 localhost nova_compute[281415]: 2025-11-26 10:11:06.143 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:06 localhost kernel: device tap66ba12d6-97 left promiscuous mode Nov 26 05:11:06 localhost ovn_controller[153664]: 2025-11-26T10:11:06Z|00520|binding|INFO|Setting lport 66ba12d6-97e8-4734-ac36-52496d50a518 down in Southbound Nov 26 05:11:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:06.165 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97357935186e4539b78bb721e122577a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=62fbbb20-962c-4fb3-aa5f-b185a37d1444, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=66ba12d6-97e8-4734-ac36-52496d50a518) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:11:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:06.166 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 66ba12d6-97e8-4734-ac36-52496d50a518 in datapath a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7 unbound from our chassis#033[00m Nov 26 05:11:06 localhost nova_compute[281415]: 2025-11-26 10:11:06.167 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:06.169 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:11:06 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:06.169 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[18e4554e-42dd-4c8a-92a9-8e77545ad467]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:11:07 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e278 e278: 6 total, 6 up, 6 in Nov 26 05:11:07 localhost ovn_controller[153664]: 2025-11-26T10:11:07Z|00521|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:11:07 localhost nova_compute[281415]: 2025-11-26 10:11:07.950 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:08 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 26 05:11:08 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Nov 26 05:11:08 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Nov 26 05:11:08 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Nov 26 05:11:08 localhost podman[330175]: 2025-11-26 10:11:08.538982874 +0000 UTC m=+0.042114970 container kill 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:11:08 localhost dnsmasq[330042]: exiting on receipt of SIGTERM Nov 26 
05:11:08 localhost systemd[1]: tmp-crun.4MdNPS.mount: Deactivated successfully. Nov 26 05:11:08 localhost systemd[1]: libpod-04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982.scope: Deactivated successfully. Nov 26 05:11:08 localhost podman[330190]: 2025-11-26 10:11:08.597092639 +0000 UTC m=+0.042218063 container died 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 26 05:11:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982-userdata-shm.mount: Deactivated successfully. Nov 26 05:11:08 localhost podman[330190]: 2025-11-26 10:11:08.684540195 +0000 UTC m=+0.129665579 container cleanup 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 26 05:11:08 localhost systemd[1]: libpod-conmon-04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982.scope: Deactivated successfully. 
Nov 26 05:11:08 localhost podman[330191]: 2025-11-26 10:11:08.707080326 +0000 UTC m=+0.145016636 container remove 04ff34a52bfe265a6fe76fd6095b7a17fe93e00b2606f63082d911a85ca98982 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a7f4d4e3-3cce-4bbf-b4f0-67cb210bc5c7, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 26 05:11:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:11:08.734 262471 INFO neutron.agent.dhcp.agent [None req-7fb9af10-41f8-4d91-a567-45408c1083e9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:11:08 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:11:08.744 262471 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.850 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:08 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e279 e279: 6 total, 6 up, 6 in Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.891 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.892 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.892 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.892 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.893 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:11:08 localhost nova_compute[281415]: 2025-11-26 10:11:08.943 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:09 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:11:09 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2397356698' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.354 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.449 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.450 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:11:09 localhost systemd[1]: var-lib-containers-storage-overlay-acaa3199f2d9b4bfd0f6f8f5b1b91ce43b125639acae88b1f36d6142abf0e1d2-merged.mount: Deactivated successfully. Nov 26 05:11:09 localhost systemd[1]: run-netns-qdhcp\x2da7f4d4e3\x2d3cce\x2d4bbf\x2db4f0\x2d67cb210bc5c7.mount: Deactivated successfully. Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.673 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.675 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11121MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.676 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.676 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.777 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively 
managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.777 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.778 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:11:09 localhost nova_compute[281415]: 2025-11-26 10:11:09.821 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:11:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:11:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1951683723' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:11:10 localhost nova_compute[281415]: 2025-11-26 10:11:10.260 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:11:10 localhost nova_compute[281415]: 2025-11-26 10:11:10.269 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:11:10 localhost nova_compute[281415]: 2025-11-26 10:11:10.296 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:11:10 localhost nova_compute[281415]: 2025-11-26 10:11:10.299 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:11:10 localhost nova_compute[281415]: 2025-11-26 10:11:10.300 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:11:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:10 localhost nova_compute[281415]: 2025-11-26 10:11:10.619 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:11:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2127002984' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:11:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:11:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2127002984' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:11:11 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch Nov 26 05:11:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:11:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:11:12 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:11:12 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:11:12 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:11:12 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:11:12 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:11:12 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:11:13 localhost nova_compute[281415]: 2025-11-26 10:11:13.300 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:13 localhost nova_compute[281415]: 2025-11-26 10:11:13.302 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:13 localhost nova_compute[281415]: 2025-11-26 10:11:13.302 281419 DEBUG 
oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:13 localhost nova_compute[281415]: 2025-11-26 10:11:13.302 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:13 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:11:13 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:11:14 localhost nova_compute[281415]: 2025-11-26 10:11:14.027 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:14 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e280 e280: 6 total, 6 up, 6 in Nov 26 05:11:14 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:11:14 localhost nova_compute[281415]: 2025-11-26 10:11:14.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:11:15 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:11:15 localhost ceph-mon[297296]: from='mgr.34351 
172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 26 05:11:15 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/87c6788b-7612-454a-90f4-6ec8d7aef9d4/b2f97048-063c-4598-b4e7-f804cc577acf", "osd", "allow rw pool=manila_data namespace=fsvolumens_87c6788b-7612-454a-90f4-6ec8d7aef9d4", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:11:15 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/87c6788b-7612-454a-90f4-6ec8d7aef9d4/b2f97048-063c-4598-b4e7-f804cc577acf", "osd", "allow rw pool=manila_data namespace=fsvolumens_87c6788b-7612-454a-90f4-6ec8d7aef9d4", "mon", "allow r"], "format": "json"} : dispatch Nov 26 05:11:15 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/87c6788b-7612-454a-90f4-6ec8d7aef9d4/b2f97048-063c-4598-b4e7-f804cc577acf", "osd", "allow rw pool=manila_data namespace=fsvolumens_87c6788b-7612-454a-90f4-6ec8d7aef9d4", "mon", "allow r"], "format": "json"}]': finished Nov 26 05:11:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:15 localhost nova_compute[281415]: 2025-11-26 10:11:15.622 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:15 localhost openstack_network_exporter[242153]: ERROR 10:11:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files 
found for the ovs db server Nov 26 05:11:15 localhost openstack_network_exporter[242153]: ERROR 10:11:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:11:15 localhost openstack_network_exporter[242153]: ERROR 10:11:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:11:15 localhost openstack_network_exporter[242153]: ERROR 10:11:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:11:15 localhost openstack_network_exporter[242153]: Nov 26 05:11:15 localhost openstack_network_exporter[242153]: ERROR 10:11:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:11:15 localhost openstack_network_exporter[242153]: Nov 26 05:11:15 localhost nova_compute[281415]: 2025-11-26 10:11:15.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:11:15 localhost nova_compute[281415]: 2025-11-26 10:11:15.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:11:15 localhost nova_compute[281415]: 2025-11-26 10:11:15.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:11:16 localhost nova_compute[281415]: 2025-11-26 10:11:16.330 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:11:16 localhost nova_compute[281415]: 2025-11-26 10:11:16.331 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:11:16 localhost nova_compute[281415]: 2025-11-26 10:11:16.332 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:11:16 localhost nova_compute[281415]: 2025-11-26 10:11:16.332 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:11:16 localhost nova_compute[281415]: 2025-11-26 10:11:16.830 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:11:16 localhost nova_compute[281415]: 2025-11-26 10:11:16.861 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:11:16 localhost nova_compute[281415]: 2025-11-26 10:11:16.862 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:11:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e281 e281: 6 total, 6 up, 6 in Nov 26 05:11:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:11:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:11:17 localhost podman[330405]: 2025-11-26 10:11:17.851153695 +0000 UTC m=+0.102699181 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:11:17 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:11:17 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:11:17 localhost podman[330406]: 2025-11-26 10:11:17.90216987 +0000 UTC m=+0.153975094 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 
'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 26 05:11:17 localhost podman[330405]: 2025-11-26 10:11:17.913877174 +0000 UTC m=+0.165422610 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:11:17 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:11:17 localhost podman[330406]: 2025-11-26 10:11:17.943738742 +0000 UTC m=+0.195543946 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 26 05:11:17 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:11:19 localhost nova_compute[281415]: 2025-11-26 10:11:19.030 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:20 localhost nova_compute[281415]: 2025-11-26 10:11:20.625 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:21 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 26 05:11:22 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e282 e282: 6 total, 6 up, 6 in Nov 26 05:11:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:11:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:11:22 localhost systemd[1]: tmp-crun.chBQl2.mount: Deactivated successfully. 
Nov 26 05:11:22 localhost podman[330447]: 2025-11-26 10:11:22.885840382 +0000 UTC m=+0.143531280 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller) Nov 26 05:11:22 localhost podman[330448]: 2025-11-26 10:11:22.847029026 +0000 UTC m=+0.099685688 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., vcs-type=git) Nov 26 05:11:22 localhost podman[330448]: 2025-11-26 10:11:22.93084689 +0000 UTC m=+0.183503542 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, config_id=edpm) Nov 26 05:11:22 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:11:22 localhost podman[330447]: 2025-11-26 10:11:22.955389562 +0000 UTC m=+0.213080540 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:11:22 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:11:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e283 e283: 6 total, 6 up, 6 in Nov 26 05:11:24 localhost nova_compute[281415]: 2025-11-26 10:11:24.033 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:11:24 localhost podman[330492]: 2025-11-26 10:11:24.239578975 +0000 UTC m=+0.079046776 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 
05:11:24 localhost podman[330492]: 2025-11-26 10:11:24.252379593 +0000 UTC m=+0.091847354 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:11:24 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:11:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:11:25 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:11:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:25 localhost nova_compute[281415]: 2025-11-26 10:11:25.628 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:27 localhost podman[240049]: time="2025-11-26T10:11:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:11:27 localhost podman[240049]: @ - - [26/Nov/2025:10:11:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:11:27 localhost podman[240049]: @ - - [26/Nov/2025:10:11:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1" Nov 26 05:11:28 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 26 05:11:28 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Nov 26 05:11:28 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Nov 26 05:11:28 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' cmd='[{"prefix": "auth rm", "entity": 
"client.david"}]': finished Nov 26 05:11:28 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 e284: 6 total, 6 up, 6 in Nov 26 05:11:29 localhost nova_compute[281415]: 2025-11-26 10:11:29.037 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:30 localhost nova_compute[281415]: 2025-11-26 10:11:30.632 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:11:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 05:11:30 localhost podman[330515]: 2025-11-26 10:11:30.826161802 +0000 UTC m=+0.087577501 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd) Nov 26 05:11:30 localhost podman[330515]: 2025-11-26 10:11:30.842550292 +0000 UTC m=+0.103966031 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:11:30 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. Nov 26 05:11:30 localhost systemd[1]: tmp-crun.0bCbxd.mount: Deactivated successfully. 
Nov 26 05:11:30 localhost podman[330514]: 2025-11-26 10:11:30.93166508 +0000 UTC m=+0.192972526 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:11:30 localhost podman[330514]: 2025-11-26 10:11:30.939444882 +0000 UTC 
m=+0.200752368 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Nov 26 05:11:30 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:11:31 localhost sshd[330551]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:11:34 localhost nova_compute[281415]: 2025-11-26 10:11:34.067 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:35 localhost nova_compute[281415]: 2025-11-26 10:11:35.032 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:35 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:35.033 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:11:35 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:35.034 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:11:35 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:35 localhost nova_compute[281415]: 2025-11-26 10:11:35.635 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:38 localhost ovn_controller[153664]: 2025-11-26T10:11:38Z|00522|memory_trim|INFO|Detected inactivity (last active 30002 ms 
ago): trimming memory Nov 26 05:11:39 localhost nova_compute[281415]: 2025-11-26 10:11:39.107 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:40 localhost nova_compute[281415]: 2025-11-26 10:11:40.639 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:11:41.036 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:11:44 localhost nova_compute[281415]: 2025-11-26 10:11:44.116 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:45 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:45 localhost nova_compute[281415]: 2025-11-26 10:11:45.642 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:45 localhost openstack_network_exporter[242153]: ERROR 10:11:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:11:45 localhost openstack_network_exporter[242153]: ERROR 10:11:45 appctl.go:144: Failed to get PID for ovn-northd: no control 
socket files found for ovn-northd Nov 26 05:11:45 localhost openstack_network_exporter[242153]: ERROR 10:11:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:11:45 localhost openstack_network_exporter[242153]: ERROR 10:11:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:11:45 localhost openstack_network_exporter[242153]: Nov 26 05:11:45 localhost openstack_network_exporter[242153]: ERROR 10:11:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:11:45 localhost openstack_network_exporter[242153]: Nov 26 05:11:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:11:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:11:48 localhost podman[330553]: 2025-11-26 10:11:48.835891628 +0000 UTC m=+0.096787147 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, 
maintainer=Navid Yaghoobi ) Nov 26 05:11:48 localhost systemd[1]: tmp-crun.30OXh7.mount: Deactivated successfully. Nov 26 05:11:48 localhost podman[330554]: 2025-11-26 10:11:48.888064719 +0000 UTC m=+0.148789143 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, tcib_managed=true) Nov 26 05:11:48 
localhost podman[330554]: 2025-11-26 10:11:48.898728371 +0000 UTC m=+0.159452815 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:11:48 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 05:11:48 localhost podman[330553]: 2025-11-26 10:11:48.951419877 +0000 UTC m=+0.212315426 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 26 05:11:48 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:11:49 localhost nova_compute[281415]: 2025-11-26 10:11:49.118 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:50 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:50 localhost nova_compute[281415]: 2025-11-26 10:11:50.646 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 05:11:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:11:53 localhost podman[330595]: 2025-11-26 10:11:53.825157412 +0000 UTC m=+0.085012002 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118) Nov 26 05:11:53 localhost systemd[1]: tmp-crun.iW7Qt4.mount: Deactivated successfully. 
Nov 26 05:11:53 localhost podman[330596]: 2025-11-26 10:11:53.904112574 +0000 UTC m=+0.159405413 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Nov 26 05:11:53 localhost podman[330596]: 2025-11-26 10:11:53.918875234 +0000 UTC m=+0.174168103 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.) Nov 26 05:11:53 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:11:53 localhost podman[330595]: 2025-11-26 10:11:53.936061337 +0000 UTC m=+0.195915977 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 26 05:11:53 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:11:54 localhost nova_compute[281415]: 2025-11-26 10:11:54.120 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:11:55 localhost systemd[1]: tmp-crun.fUy9O0.mount: Deactivated successfully. Nov 26 05:11:55 localhost podman[330642]: 2025-11-26 10:11:55.05027428 +0000 UTC m=+0.132250749 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', 
'--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:11:55 localhost podman[330642]: 2025-11-26 10:11:55.059151807 +0000 UTC m=+0.141128236 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible) Nov 26 05:11:55 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. Nov 26 05:11:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:11:55 localhost nova_compute[281415]: 2025-11-26 10:11:55.649 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:11:57 localhost podman[240049]: time="2025-11-26T10:11:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:11:57 localhost podman[240049]: @ - - [26/Nov/2025:10:11:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:11:57 localhost podman[240049]: @ - - [26/Nov/2025:10:11:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18798 "" "Go-http-client/1.1" Nov 26 05:11:59 localhost nova_compute[281415]: 2025-11-26 10:11:59.169 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:00 localhost nova_compute[281415]: 2025-11-26 10:12:00.653 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 05:12:01 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:12:01.007 262471 INFO neutron.agent.linux.ip_lib [None req-2310c75a-8609-47d8-9ead-fbba07cdd865 - - - - - -] Device tap2f5ea239-93 cannot be used as it has no MAC address#033[00m Nov 26 05:12:01 localhost systemd[1]: tmp-crun.an0J6C.mount: Deactivated successfully. Nov 26 05:12:01 localhost podman[330667]: 2025-11-26 10:12:01.026702954 +0000 UTC m=+0.092263337 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3) Nov 26 05:12:01 localhost nova_compute[281415]: 2025-11-26 10:12:01.040 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:12:01 localhost kernel: device tap2f5ea239-93 entered promiscuous mode Nov 26 05:12:01 localhost ovn_controller[153664]: 2025-11-26T10:12:01Z|00523|binding|INFO|Claiming lport 2f5ea239-93f2-4dfc-8a28-947ab5f6db01 for this chassis. Nov 26 05:12:01 localhost ovn_controller[153664]: 2025-11-26T10:12:01Z|00524|binding|INFO|2f5ea239-93f2-4dfc-8a28-947ab5f6db01: Claiming unknown Nov 26 05:12:01 localhost NetworkManager[5970]: [1764151921.0541] manager: (tap2f5ea239-93): new Generic device (/org/freedesktop/NetworkManager/Devices/80) Nov 26 05:12:01 localhost nova_compute[281415]: 2025-11-26 10:12:01.051 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:01 localhost podman[330667]: 2025-11-26 10:12:01.05520909 +0000 UTC m=+0.120769473 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:12:01 localhost systemd-udevd[330700]: Network interface NamePolicy= disabled on kernel command line. 
Nov 26 05:12:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:01.064 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d075d3f8-292c-4df9-9421-a5e645f00f81', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d075d3f8-292c-4df9-9421-a5e645f00f81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53fd247374fe4fd29fbc5ba5a3ef7754', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b947871-56a9-4337-8534-3a21bfcedf5a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2f5ea239-93f2-4dfc-8a28-947ab5f6db01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:12:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:01.066 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 2f5ea239-93f2-4dfc-8a28-947ab5f6db01 in datapath d075d3f8-292c-4df9-9421-a5e645f00f81 bound to our chassis#033[00m Nov 26 05:12:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:01.070 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Port e265333c-e21b-45d7-a10d-183b2aa72bf5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 26 05:12:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:01.071 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d075d3f8-292c-4df9-9421-a5e645f00f81, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:12:01 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:01.072 159592 DEBUG oslo.privsep.daemon [-] privsep: reply[ed57a8fb-995c-41f5-bcc4-639161ddbf22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:12:01 localhost ovn_controller[153664]: 2025-11-26T10:12:01Z|00525|binding|INFO|Setting lport 2f5ea239-93f2-4dfc-8a28-947ab5f6db01 ovn-installed in OVS Nov 26 05:12:01 localhost ovn_controller[153664]: 2025-11-26T10:12:01Z|00526|binding|INFO|Setting lport 2f5ea239-93f2-4dfc-8a28-947ab5f6db01 up in Southbound Nov 26 05:12:01 localhost nova_compute[281415]: 2025-11-26 10:12:01.078 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:01 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:12:01 localhost nova_compute[281415]: 2025-11-26 10:12:01.087 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:01 localhost nova_compute[281415]: 2025-11-26 10:12:01.089 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:01 localhost journal[229445]: ethtool ioctl error on tap2f5ea239-93: No such device Nov 26 05:12:01 localhost journal[229445]: ethtool ioctl error on tap2f5ea239-93: No such device Nov 26 05:12:01 localhost journal[229445]: ethtool ioctl error on tap2f5ea239-93: No such device Nov 26 05:12:01 localhost journal[229445]: ethtool ioctl error on tap2f5ea239-93: No such device Nov 26 05:12:01 localhost journal[229445]: ethtool ioctl error on tap2f5ea239-93: No such device Nov 26 05:12:01 localhost journal[229445]: ethtool ioctl error on tap2f5ea239-93: No such device Nov 26 05:12:01 localhost journal[229445]: ethtool ioctl error on tap2f5ea239-93: No such device Nov 26 05:12:01 localhost journal[229445]: ethtool ioctl error on tap2f5ea239-93: No such device Nov 26 05:12:01 localhost nova_compute[281415]: 2025-11-26 10:12:01.132 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:01 localhost podman[330693]: 2025-11-26 10:12:01.158986763 +0000 UTC m=+0.102536916 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118) Nov 26 05:12:01 localhost nova_compute[281415]: 2025-11-26 10:12:01.167 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:01 localhost podman[330693]: 2025-11-26 10:12:01.188274063 +0000 UTC m=+0.131824176 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent) Nov 26 05:12:01 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. Nov 26 05:12:01 localhost nova_compute[281415]: 2025-11-26 10:12:01.703 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:01 localhost systemd[1]: tmp-crun.WrClQd.mount: Deactivated successfully. 
Nov 26 05:12:02 localhost podman[330783]: Nov 26 05:12:02 localhost podman[330783]: 2025-11-26 10:12:02.144784668 +0000 UTC m=+0.094899779 container create b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 26 05:12:02 localhost systemd[1]: Started libpod-conmon-b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c.scope. Nov 26 05:12:02 localhost podman[330783]: 2025-11-26 10:12:02.099544563 +0000 UTC m=+0.049659724 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 26 05:12:02 localhost systemd[1]: Started libcrun container. 
Nov 26 05:12:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee6b694c4d0e89bb8ad8c2882f9e49bedcf385ea139f29594a88e67aec37f646/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 26 05:12:02 localhost podman[330783]: 2025-11-26 10:12:02.217576649 +0000 UTC m=+0.167691750 container init b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:12:02 localhost podman[330783]: 2025-11-26 10:12:02.227294591 +0000 UTC m=+0.177409682 container start b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 26 05:12:02 localhost dnsmasq[330801]: started, version 2.85 cachesize 150 Nov 26 05:12:02 localhost dnsmasq[330801]: DNS service limited to local subnets Nov 26 05:12:02 localhost dnsmasq[330801]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 26 05:12:02 localhost dnsmasq[330801]: warning: no upstream servers 
configured Nov 26 05:12:02 localhost dnsmasq-dhcp[330801]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 26 05:12:02 localhost dnsmasq[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/addn_hosts - 0 addresses Nov 26 05:12:02 localhost dnsmasq-dhcp[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/host Nov 26 05:12:02 localhost dnsmasq-dhcp[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/opts Nov 26 05:12:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:12:02.331 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:12:02Z, description=, device_id=fbf519de-fd74-495e-984d-3cebff7cd0f9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=61439325-0262-4258-936c-ccc8ce13076c, ip_allocation=immediate, mac_address=fa:16:3e:31:f4:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:11:58Z, description=, dns_domain=, id=d075d3f8-292c-4df9-9421-a5e645f00f81, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-1549214327-network, port_security_enabled=True, project_id=53fd247374fe4fd29fbc5ba5a3ef7754, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50832, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3896, status=ACTIVE, subnets=['5aa6fb45-6610-4972-9023-cf5569234a9a'], tags=[], tenant_id=53fd247374fe4fd29fbc5ba5a3ef7754, updated_at=2025-11-26T10:11:59Z, vlan_transparent=None, network_id=d075d3f8-292c-4df9-9421-a5e645f00f81, port_security_enabled=False, project_id=53fd247374fe4fd29fbc5ba5a3ef7754, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3904, status=DOWN, tags=[], tenant_id=53fd247374fe4fd29fbc5ba5a3ef7754, updated_at=2025-11-26T10:12:02Z on network d075d3f8-292c-4df9-9421-a5e645f00f81#033[00m Nov 26 05:12:02 localhost dnsmasq[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/addn_hosts - 1 addresses Nov 26 05:12:02 localhost dnsmasq-dhcp[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/host Nov 26 05:12:02 localhost podman[330819]: 2025-11-26 10:12:02.577693206 +0000 UTC m=+0.064468644 container kill b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 26 05:12:02 localhost dnsmasq-dhcp[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/opts Nov 26 05:12:02 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:12:02.968 262471 INFO neutron.agent.dhcp.agent [None req-8e3a3440-6ee8-4b63-9c58-534b77690198 - - - - - -] DHCP configuration for ports {'ccf05ff3-5c00-4139-aedd-f268a63ae6e6'} is completed#033[00m Nov 26 05:12:03 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:12:03.103 262471 INFO neutron.agent.dhcp.agent [None req-c5915ecf-e9be-4d39-b6b4-7427dde2b44a - - - - - -] DHCP configuration for ports {'61439325-0262-4258-936c-ccc8ce13076c'} is completed#033[00m Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.587 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'name': 'test', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005536118.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'hostId': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.593 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'efe97794-beb6-4cc0-b8a7-c91247309866', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.589199', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5adc86e8-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '484ae4d3be4c0c2b3d74aaa47d1766c798e37002d53d5a905369cac9203c518d'}]}, 'timestamp': '2025-11-26 10:12:03.594510', '_unique_id': '7e02d3d855984f14b3cee8cca48d39f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.596 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.597 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.597 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91367938-bd15-462f-8074-ae021f0b298f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.597518', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5add1108-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '2bf327ba0de7f01bd5d958987061de496ee90b06e6724913a0d1a9248f2dde17'}]}, 'timestamp': '2025-11-26 10:12:03.598034', '_unique_id': 'ee1615ed56dd4fcbae8e1b33efb78953'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.598 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.600 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.600 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.612 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.612 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:12:03.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d97f195-ee9c-407f-919b-0bea6dfb3c77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.600536', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5adf5a9e-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.842783695, 'message_signature': 'f71cf0ea9bb588de43fc5af4fbac62acbb0443825070aaa1b5f0d7cc286e6fca'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:12:03.600536', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5adf6cd2-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.842783695, 'message_signature': 'fae9847e8dc89de88cf5892420fdde5945ba185c9c332805916e06d4dc261d42'}]}, 'timestamp': '2025-11-26 10:12:03.613411', '_unique_id': '72072b4e2f3c431fa6a3910a7c0ca7d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.614 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.615 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.643 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 1143371229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.643 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.latency volume: 23326743 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ed4043e-a3df-4620-b406-d7cbf42d05ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1143371229, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.615622', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5ae411e2-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '2453ab65bee474419c01ad7acf17786aa9afb95c7941625519fee6bd87333a86'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23326743, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 
'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:12:03.615622', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5ae424b6-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': 'de004f1491e70ad454cec82623ed71380f4ad49b57ea1e03c445baf613373337'}]}, 'timestamp': '2025-11-26 10:12:03.644335', '_unique_id': 'e689fc5ea1384215ae86da64dc743780'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.645 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.646 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.646 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8264df0d-4768-4c6c-8cb9-af9b3739ab78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.646595', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5ae48e24-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '3d9fe1771c0f36b6eb81026dd7f7388045104b8b13211e88489922f192530fb4'}]}, 'timestamp': '2025-11-26 10:12:03.647094', '_unique_id': '79461a630c3d48368749653105ed023b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging
Traceback (most recent call last):
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR
oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.648 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.649 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.649 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8a55d8b-18e9-4e1d-a3e0-05c7ffef57f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.649195', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64',
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5ae4f486-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '2826f1b721036d554494b129b6f2323b47d32ae83f862be60b29ae864e34fc2c'}]}, 'timestamp': '2025-11-26 10:12:03.649684', '_unique_id': '92145b6fbcdd45b6ad4147b6ff7f2d73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:12:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 26 05:12:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 26 05:12:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.650 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.651 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.651 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '600f0f3a-3a24-4fe9-8428-0ab89cb8585e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.651763', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5ae55930-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '7529a888cbb7233f4c6f09bd987f577c1cf6110c6e70cafae26335cfa1d3bc30'}]}, 'timestamp': '2025-11-26 10:12:03.652257', '_unique_id': 'e0e6dc5935db4114b2dc5c0fee418ea9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]:
2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging yield
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging
Nov 26 05:12:03 localhost
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.653 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.654 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.654 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 1723586642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.654 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.latency volume: 89399569 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a09faa9-631d-4375-9f79-7af52d2d8f37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1723586642, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.654494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5ae5c230-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '125d82053e802240c676ee94f0cfb41b18f209beaa5b1e8c519c45013df0c70d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89399569, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:12:03.654494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5ae5d3ce-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '99bda83a2c2bba3262230feb76e1919a6ae9a95692abdeec514ae87b19932192'}]}, 'timestamp': '2025-11-26 10:12:03.655448', '_unique_id': '20287f33ba59482ea91b3a0528311a0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:12:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.656 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.657 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.657 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '97b5fe77-fb29-4020-9d7c-d21db4e942cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.657762', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5ae6430e-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '5d41a2aca9a30ac59e524189fee34b05a684f380668db3a8b0247dd274f9e0cc'}]}, 'timestamp': '2025-11-26 10:12:03.658244', '_unique_id': '7379a2fa0e7f4394894d1a0d8dfbc4d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.659 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.660 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f830200d-868f-42ce-a72f-f4f3b1c0c9ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.660311', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5ae6a556-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.842783695, 'message_signature': '8811c400d46144c90d78284f176b6568bfbdb7bed2b2b5a07b00ae7a91c733e4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:12:03.660311', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5ae6b690-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.842783695, 'message_signature': '5e2ced2ab205b2ea41000da925afaa2e5be3a1d59b151c30b92951978e1b6992'}]}, 'timestamp': '2025-11-26 10:12:03.661174', '_unique_id': '697a9a8bba5343d0b84756dc57ceb9e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.662 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.663 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.663 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f0046c1e-6301-47f9-9a91-d4bcd6e94e45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.663299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5ae71a2c-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': 'aacc6efd42abb41a84a7613fbbdc0f104971e43a218664e540ea91f0f1318610'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:12:03.663299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5ae72a1c-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': 'ae86dc08ffd36600d0d490d49ce730b051e5d73219b78216c526b2d228022a78'}]}, 'timestamp': '2025-11-26 10:12:03.664169', '_unique_id': 'ee7490d2ad244484b3ae7175ca186f43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.665 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.666 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.666 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '58f434e8-f0e6-4108-bc02-9915d888aadc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.666377', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5ae79286-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '5394767f312305224f01c34e5cbc08f2a8027b110170e1d3073b234360369b23'}]}, 'timestamp': '2025-11-26 10:12:03.666832', '_unique_id': 'd143890f873a48e18c85070d676975aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.667 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.668 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.669 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets volume: 68 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf7b6217-fa1f-47de-a8ec-bcb57a39b0a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 68, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.668971', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5ae7f8ca-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '4565e8b177604b4b97f712abf22687b43d450b5a03f9aaaf661f4d95f0dd962a'}]}, 'timestamp': '2025-11-26 10:12:03.669451', '_unique_id': '1a9235fdba344912ae74ab407dbeb2af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 
05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.670 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.671 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 26 05:12:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:03.680 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404 Nov 26 05:12:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:03.681 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409 Nov 26 05:12:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:03.682 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.687 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/memory.usage volume: 51.79296875 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '837b15a7-7aed-4f3b-933d-f8251e21e7b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.79296875, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:12:03.671514', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5aeaddec-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.930050476, 'message_signature': '1b77da84927cd169d99a8af37e56aec18d870218d85f7f510abbfb7696018aea'}]}, 'timestamp': '2025-11-26 10:12:03.688417', '_unique_id': '22d6217e5d88444390d661b5f738f900'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR 
oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.689 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.690 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.690 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.691 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e4becb5c-0dc7-4129-8d5f-f1e5df334828', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.690574', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5aeb437c-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '4acf4b40309df4848b55d844e2dd4ceba4c0478e63ecb57680f60731b40bd399'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:12:03.690574', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5aeb5aba-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '2da47cfbc75eb7328161fb276a750b1bc856810ce7e1cfb0306c62df74f8980d'}]}, 'timestamp': '2025-11-26 10:12:03.691656', '_unique_id': 'f6f939c0469c4adea46e02831b54b1e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.692 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.693 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.693 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.bytes volume: 7557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5d070c9b-c871-458f-89bd-51f3add5fb25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7557, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.693796', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5aebc4a0-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': '5a31154283d8c65ed6bce54492278d1ec1e090cba4ba206b878d794c62ca375b'}]}, 'timestamp': '2025-11-26 10:12:03.694332', '_unique_id': '46616284cb8a4c0dba6f3659db4deea2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.695 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.696 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.696 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.696 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9378dd17-c63a-443e-9b73-16d9026253b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.696396', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5aec26c0-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '726c38bf5541cff1dcb61cf492562aa5d1206ed02b31540c6a013031395c8ea4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:12:03.696396', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5aec3886-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '028077fe8e0ef9a634e6a897fc372fc0d8d9b24b69597377fa7db6b64b61b83b'}]}, 'timestamp': '2025-11-26 10:12:03.697268', '_unique_id': 'f2d049691307418fb1e9e025ad4e0758'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 
2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.698 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.699 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.699 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/cpu volume: 19740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e25f917-8150-40c6-bea7-a2dc74aa7de9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19740000000, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'timestamp': '2025-11-26T10:12:03.699369', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 
'cpu_number': 1}, 'message_id': '5aec9af6-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.930050476, 'message_signature': 'be632eadd1c399ca698d6cadc26edb5e434949c2dbfd050fe444d20dfdeb3610'}]}, 'timestamp': '2025-11-26 10:12:03.699800', '_unique_id': 'b08e3e4f9c71493db19d1308e87505c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 
10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.700 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.702 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.702 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.702 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.702 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd1aabfbe-2cf3-48f9-9575-c99889260405', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.702348', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5aed0fea-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '6ec50293cd258a5ccbf8a616f1f1fe99c27e7e76c490c5c7e996eadb89372808'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': '2025-11-26T10:12:03.702348', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5aed2246-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.857872233, 'message_signature': '8b8fab67de1eb0e3c48c194a95b520a2411006be10b43c0260369a2f9dbe6cbc'}]}, 'timestamp': '2025-11-26 10:12:03.703250', '_unique_id': '84dabe6944ab491a9d51a6b97a7e2d43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 
05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.704 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.705 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.705 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.705 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9981cf8-8f2e-491c-b38e-cb43eb34b202', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vda', 'timestamp': '2025-11-26T10:12:03.705368', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5aed8560-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.842783695, 'message_signature': 'a1b7f22e3f716b73d5b8e82424e194c7fe30d8ace644042f8ce6867818b0be8a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af-vdb', 'timestamp': 
'2025-11-26T10:12:03.705368', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5aed967c-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.842783695, 'message_signature': '254fb50d2bdeb4d413d0ad68fe2af5f5fb6a66357561f36838741aebca83f294'}]}, 'timestamp': '2025-11-26 10:12:03.706225', '_unique_id': 'fa2f362c29874815af0337cd28557ea3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.707 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.708 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.708 12 DEBUG ceilometer.compute.pollsters [-] 9d78bef9-6977-4fb5-b50b-ae75124e73af/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8ed91640-a7df-4e3e-b79d-21435270e28d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9f8fafc3f43241c3a71039595891ea0e', 'user_name': None, 'project_id': 'b2fe3cd6f6ea49b8a2de01b236dd92e3', 'project_name': None, 'resource_id': 'instance-00000002-9d78bef9-6977-4fb5-b50b-ae75124e73af-tap5afdc9d0-95', 'timestamp': '2025-11-26T10:12:03.708301', 'resource_metadata': {'display_name': 'test', 'name': 'tap5afdc9d0-95', 'instance_id': '9d78bef9-6977-4fb5-b50b-ae75124e73af', 'instance_type': 'm1.small', 'host': '7a4d4169a8a0e423caadf45517f57bbe6dead17726a36b37547813ac', 'instance_host': 'np0005536118.localdomain', 'flavor': {'id': 'a8cafabf-98f1-4bbc-a3ca-a9382f40900b', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7ebee4f6-b3ad-441d-abd0-239ae838ae37'}, 'image_ref': '7ebee4f6-b3ad-441d-abd0-239ae838ae37', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:8c:0f:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afdc9d0-95'}, 'message_id': '5aedf810-cab0-11f0-9fe3-fa163e73ba36', 'monotonic_time': 12398.831453913, 'message_signature': 'e44483974582d5d87a43e670706be0cfe6465def098bcd1ba350e2351e30a1fd'}]}, 'timestamp': '2025-11-26 10:12:03.708750', '_unique_id': 'be3dc6fb9c0b484c8ef8d5dd23d2ce17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging yield Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:03 localhost 
ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 26 05:12:03 
localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 26 05:12:03 localhost ceilometer_agent_compute[237388]: 2025-11-26 10:12:03.709 12 ERROR oslo_messaging.notify.messaging Nov 26 05:12:04 localhost ceph-mon[297296]: 
rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 26 05:12:04 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 4865 writes, 36K keys, 4865 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s#012Cumulative WAL: 4865 writes, 4865 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2521 writes, 13K keys, 2521 commit groups, 1.0 writes per commit group, ingest: 19.60 MB, 0.03 MB/s#012Interval WAL: 2521 writes, 2521 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 149.9 0.26 0.12 17 0.015 0 0 0.0 0.0#012 L6 1/0 17.23 MB 0.0 0.3 0.0 0.2 0.2 0.0 0.0 6.5 203.3 187.1 1.36 0.80 16 0.085 203K 8289 0.0 0.0#012 Sum 1/0 17.23 MB 0.0 0.3 0.0 0.2 0.3 0.1 0.0 7.5 170.5 181.1 1.62 0.91 33 0.049 203K 8289 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 11.1 170.8 175.2 0.73 0.41 14 0.052 96K 3756 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 
0.00 KB 0.0 0.3 0.0 0.2 0.2 0.0 0.0 0.0 203.3 187.1 1.36 0.80 16 0.085 203K 8289 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 151.8 0.26 0.12 16 0.016 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.5 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.038, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.29 GB write, 0.24 MB/s write, 0.27 GB read, 0.23 MB/s read, 1.6 seconds#012Interval compaction: 0.13 GB write, 0.21 MB/s write, 0.12 GB read, 0.21 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5572d8ebd350#2 capacity: 304.00 MB usage: 37.90 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000356 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2519,36.54 MB,12.0183%) FilterBlock(33,607.23 KB,0.195067%) IndexBlock(33,788.11 KB,0.25317%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Nov 26 05:12:04 localhost nova_compute[281415]: 2025-11-26 10:12:04.201 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:05 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:05 localhost nova_compute[281415]: 2025-11-26 
10:12:05.655 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:05 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:12:05.912 262471 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-26T10:12:02Z, description=, device_id=fbf519de-fd74-495e-984d-3cebff7cd0f9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=61439325-0262-4258-936c-ccc8ce13076c, ip_allocation=immediate, mac_address=fa:16:3e:31:f4:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-26T10:11:58Z, description=, dns_domain=, id=d075d3f8-292c-4df9-9421-a5e645f00f81, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-1549214327-network, port_security_enabled=True, project_id=53fd247374fe4fd29fbc5ba5a3ef7754, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50832, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3896, status=ACTIVE, subnets=['5aa6fb45-6610-4972-9023-cf5569234a9a'], tags=[], tenant_id=53fd247374fe4fd29fbc5ba5a3ef7754, updated_at=2025-11-26T10:11:59Z, vlan_transparent=None, network_id=d075d3f8-292c-4df9-9421-a5e645f00f81, port_security_enabled=False, project_id=53fd247374fe4fd29fbc5ba5a3ef7754, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3904, status=DOWN, tags=[], tenant_id=53fd247374fe4fd29fbc5ba5a3ef7754, updated_at=2025-11-26T10:12:02Z on network d075d3f8-292c-4df9-9421-a5e645f00f81#033[00m Nov 26 05:12:06 localhost dnsmasq[330801]: read 
/var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/addn_hosts - 1 addresses Nov 26 05:12:06 localhost dnsmasq-dhcp[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/host Nov 26 05:12:06 localhost dnsmasq-dhcp[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/opts Nov 26 05:12:06 localhost podman[330855]: 2025-11-26 10:12:06.107674029 +0000 UTC m=+0.049872391 container kill b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 26 05:12:06 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:12:06.356 262471 INFO neutron.agent.dhcp.agent [None req-70bf8361-fd34-46a2-b292-3320ba30d984 - - - - - -] DHCP configuration for ports {'61439325-0262-4258-936c-ccc8ce13076c'} is completed#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.203 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task 
ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.871 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.871 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.872 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.872 281419 DEBUG 
nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 05:12:09 localhost nova_compute[281415]: 2025-11-26 10:12:09.873 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:12:10 localhost dnsmasq[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/addn_hosts - 0 addresses Nov 26 05:12:10 localhost dnsmasq-dhcp[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/host Nov 26 05:12:10 localhost dnsmasq-dhcp[330801]: read /var/lib/neutron/dhcp/d075d3f8-292c-4df9-9421-a5e645f00f81/opts Nov 26 05:12:10 localhost podman[330915]: 2025-11-26 10:12:10.258690021 +0000 UTC m=+0.068883540 container kill b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:12:10 localhost systemd[1]: tmp-crun.1VtgA3.mount: Deactivated successfully. Nov 26 05:12:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:12:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3955359977' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.370 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.433 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.433 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:12:10 localhost ovn_controller[153664]: 2025-11-26T10:12:10Z|00527|binding|INFO|Releasing lport 2f5ea239-93f2-4dfc-8a28-947ab5f6db01 from this chassis (sb_readonly=0) Nov 26 05:12:10 localhost kernel: device tap2f5ea239-93 left promiscuous mode Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.469 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:10 localhost ovn_controller[153664]: 2025-11-26T10:12:10Z|00528|binding|INFO|Setting lport 2f5ea239-93f2-4dfc-8a28-947ab5f6db01 down in Southbound Nov 26 05:12:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:10.479 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, 
old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005536118.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp6afb6947-feef-5137-bf43-039e7f09cb85-d075d3f8-292c-4df9-9421-a5e645f00f81', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d075d3f8-292c-4df9-9421-a5e645f00f81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '53fd247374fe4fd29fbc5ba5a3ef7754', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005536118.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b947871-56a9-4337-8534-3a21bfcedf5a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2f5ea239-93f2-4dfc-8a28-947ab5f6db01) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:12:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:10.481 159486 INFO neutron.agent.ovn.metadata.agent [-] Port 2f5ea239-93f2-4dfc-8a28-947ab5f6db01 in datapath d075d3f8-292c-4df9-9421-a5e645f00f81 unbound from our chassis#033[00m Nov 26 05:12:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:10.483 159486 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d075d3f8-292c-4df9-9421-a5e645f00f81, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 26 05:12:10 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:10.485 159592 DEBUG oslo.privsep.daemon [-] privsep: 
reply[17e5afa5-dcf1-4299-a1b1-a49239ea2768]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.487 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.649 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.650 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11108MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.651 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.651 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.657 281419 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.731 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.731 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.731 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:12:10 localhost nova_compute[281415]: 2025-11-26 10:12:10.767 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:12:11 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:12:11 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1293840666' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:12:11 localhost nova_compute[281415]: 2025-11-26 10:12:11.221 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:12:11 localhost nova_compute[281415]: 2025-11-26 10:12:11.229 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:12:11 localhost nova_compute[281415]: 2025-11-26 10:12:11.251 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:12:11 localhost nova_compute[281415]: 2025-11-26 10:12:11.254 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:12:11 localhost nova_compute[281415]: 2025-11-26 10:12:11.254 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:12:11 localhost ovn_controller[153664]: 2025-11-26T10:12:11Z|00529|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:12:11 localhost nova_compute[281415]: 2025-11-26 10:12:11.423 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:11 localhost dnsmasq[330801]: exiting on receipt of SIGTERM Nov 26 05:12:11 localhost podman[330980]: 2025-11-26 10:12:11.939690602 +0000 UTC m=+0.062236094 container kill b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 26 05:12:11 localhost systemd[1]: libpod-b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c.scope: Deactivated successfully. 
Nov 26 05:12:12 localhost podman[330994]: 2025-11-26 10:12:12.014741454 +0000 UTC m=+0.059997655 container died b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 26 05:12:12 localhost systemd[1]: tmp-crun.PJ3Ptn.mount: Deactivated successfully. Nov 26 05:12:12 localhost podman[330994]: 2025-11-26 10:12:12.056646236 +0000 UTC m=+0.101902397 container cleanup b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 26 05:12:12 localhost systemd[1]: libpod-conmon-b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c.scope: Deactivated successfully. 
Nov 26 05:12:12 localhost podman[330996]: 2025-11-26 10:12:12.106299269 +0000 UTC m=+0.142576241 container remove b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d075d3f8-292c-4df9-9421-a5e645f00f81, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:12:12 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:12:12.132 262471 INFO neutron.agent.dhcp.agent [None req-2b45b443-3766-4cfc-ab02-a48d448a6ab8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:12:12 localhost neutron_dhcp_agent[262467]: 2025-11-26 10:12:12.133 262471 INFO neutron.agent.dhcp.agent [None req-2b45b443-3766-4cfc-ab02-a48d448a6ab8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 26 05:12:12 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:12:12 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:12:12 localhost systemd[1]: var-lib-containers-storage-overlay-ee6b694c4d0e89bb8ad8c2882f9e49bedcf385ea139f29594a88e67aec37f646-merged.mount: Deactivated successfully. Nov 26 05:12:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2eeadde59d09f5277188199bcce97e644b9898ace7613255b6b8cab22b5fc0c-userdata-shm.mount: Deactivated successfully. 
Nov 26 05:12:12 localhost systemd[1]: run-netns-qdhcp\x2dd075d3f8\x2d292c\x2d4df9\x2d9421\x2da5e645f00f81.mount: Deactivated successfully. Nov 26 05:12:13 localhost nova_compute[281415]: 2025-11-26 10:12:13.250 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:13 localhost nova_compute[281415]: 2025-11-26 10:12:13.279 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:13 localhost nova_compute[281415]: 2025-11-26 10:12:13.280 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:13 localhost nova_compute[281415]: 2025-11-26 10:12:13.280 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:13 localhost nova_compute[281415]: 2025-11-26 10:12:13.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:14 localhost nova_compute[281415]: 2025-11-26 10:12:14.206 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 
05:12:14 localhost nova_compute[281415]: 2025-11-26 10:12:14.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:15 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:12:15 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:12:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:15 localhost nova_compute[281415]: 2025-11-26 10:12:15.661 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:15 localhost openstack_network_exporter[242153]: ERROR 10:12:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:12:15 localhost openstack_network_exporter[242153]: ERROR 10:12:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:12:15 localhost openstack_network_exporter[242153]: ERROR 10:12:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:12:15 localhost openstack_network_exporter[242153]: ERROR 10:12:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:12:15 localhost openstack_network_exporter[242153]: Nov 26 05:12:15 localhost openstack_network_exporter[242153]: ERROR 10:12:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:12:15 localhost openstack_network_exporter[242153]: Nov 26 05:12:16 localhost 
ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:12:16 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:12:17 localhost nova_compute[281415]: 2025-11-26 10:12:17.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:12:17 localhost nova_compute[281415]: 2025-11-26 10:12:17.848 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:12:17 localhost nova_compute[281415]: 2025-11-26 10:12:17.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:12:17 localhost nova_compute[281415]: 2025-11-26 10:12:17.948 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:12:17 localhost nova_compute[281415]: 2025-11-26 10:12:17.949 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:12:17 localhost nova_compute[281415]: 2025-11-26 10:12:17.949 281419 DEBUG nova.network.neutron 
[None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:12:17 localhost nova_compute[281415]: 2025-11-26 10:12:17.950 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:12:18 localhost nova_compute[281415]: 2025-11-26 10:12:18.515 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:12:18 localhost nova_compute[281415]: 2025-11-26 10:12:18.540 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:12:18 localhost nova_compute[281415]: 2025-11-26 10:12:18.541 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:12:19 localhost nova_compute[281415]: 2025-11-26 10:12:19.204 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:19 localhost nova_compute[281415]: 2025-11-26 10:12:19.207 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:19 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:12:19 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:12:19 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:12:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:12:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. 
Nov 26 05:12:19 localhost podman[331110]: 2025-11-26 10:12:19.847167996 +0000 UTC m=+0.095525379 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 26 05:12:19 localhost systemd[1]: tmp-crun.N8WClw.mount: Deactivated successfully. 
Nov 26 05:12:19 localhost podman[331110]: 2025-11-26 10:12:19.898546341 +0000 UTC m=+0.146903724 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 26 05:12:19 localhost podman[331109]: 2025-11-26 10:12:19.901220714 +0000 UTC m=+0.149599837 container health_status 
b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:12:19 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. 
Nov 26 05:12:19 localhost podman[331109]: 2025-11-26 10:12:19.984493762 +0000 UTC m=+0.232872835 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:12:19 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. 
Nov 26 05:12:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:20 localhost nova_compute[281415]: 2025-11-26 10:12:20.663 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:21 localhost nova_compute[281415]: 2025-11-26 10:12:21.641 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:23 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 26 05:12:23 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.15660 172.18.0.34:0/3347094183' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 26 05:12:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:12:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. 
Nov 26 05:12:24 localhost nova_compute[281415]: 2025-11-26 10:12:24.241 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:24 localhost podman[331152]: 2025-11-26 10:12:24.252276002 +0000 UTC m=+0.096542270 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:12:24 localhost podman[331153]: 2025-11-26 10:12:24.29148374 +0000 UTC m=+0.130732391 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41) Nov 26 05:12:24 localhost podman[331152]: 2025-11-26 10:12:24.295550597 +0000 UTC m=+0.139816885 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true) Nov 26 05:12:24 localhost systemd[1]: 
123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:12:24 localhost podman[331153]: 2025-11-26 10:12:24.375627495 +0000 UTC m=+0.214876136 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm) Nov 26 05:12:24 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. Nov 26 05:12:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:25 localhost nova_compute[281415]: 2025-11-26 10:12:25.667 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. 
Nov 26 05:12:25 localhost podman[331200]: 2025-11-26 10:12:25.824273058 +0000 UTC m=+0.088793450 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:12:25 localhost podman[331200]: 2025-11-26 10:12:25.837380754 +0000 UTC m=+0.101901146 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 05:12:25 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:12:26 localhost ovn_controller[153664]: 2025-11-26T10:12:26Z|00530|binding|INFO|Releasing lport 7d243368-b21b-43d3-98dc-158093f352bc from this chassis (sb_readonly=0) Nov 26 05:12:26 localhost nova_compute[281415]: 2025-11-26 10:12:26.067 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:27 localhost podman[240049]: time="2025-11-26T10:12:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:12:27 localhost podman[240049]: @ - - [26/Nov/2025:10:12:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:12:27 localhost podman[240049]: @ - - [26/Nov/2025:10:12:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18785 "" "Go-http-client/1.1" Nov 26 05:12:29 localhost nova_compute[281415]: 2025-11-26 10:12:29.281 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:30 localhost nova_compute[281415]: 2025-11-26 10:12:30.670 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:30 localhost ceph-osd[31674]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. Nov 26 05:12:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. 
Nov 26 05:12:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:12:31 localhost podman[331224]: 2025-11-26 10:12:31.825336196 +0000 UTC m=+0.084246279 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 26 
05:12:31 localhost podman[331224]: 2025-11-26 10:12:31.835669647 +0000 UTC m=+0.094579740 container exec_died 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 05:12:31 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:12:31 localhost podman[331223]: 2025-11-26 10:12:31.92943577 +0000 UTC m=+0.190326024 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 26 05:12:31 localhost podman[331223]: 2025-11-26 10:12:31.963190478 +0000 UTC 
m=+0.224080722 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:12:31 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:12:34 localhost nova_compute[281415]: 2025-11-26 10:12:34.320 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:35 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:35 localhost nova_compute[281415]: 2025-11-26 10:12:35.673 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0. Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.470698) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61 Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151957470800, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2253, "num_deletes": 259, "total_data_size": 3317143, "memory_usage": 3381600, "flush_reason": "Manual Compaction"} Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151957483326, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 2162647, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35348, "largest_seqno": 37596, "table_properties": {"data_size": 2153756, 
"index_size": 5396, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 21421, "raw_average_key_size": 22, "raw_value_size": 2134877, "raw_average_value_size": 2198, "num_data_blocks": 230, "num_entries": 971, "num_filter_entries": 971, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764151833, "oldest_key_time": 1764151833, "file_creation_time": 1764151957, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}} Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 12701 microseconds, and 6796 cpu microseconds. Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.483411) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 2162647 bytes OK Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.483440) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.486387) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.486410) EVENT_LOG_v1 {"time_micros": 1764151957486403, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.486437) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 3306549, prev total WAL file size 3306549, number of live WAL files 2. Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.487399) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. 
'7061786F73003133303533' seq:0, type:0; will stop at (end) Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(2111KB)], [60(17MB)] Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151957487468, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 20234431, "oldest_snapshot_seqno": -1} Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 14332 keys, 18683527 bytes, temperature: kUnknown Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151957581568, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 18683527, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18598516, "index_size": 48110, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35845, "raw_key_size": 382825, "raw_average_key_size": 26, "raw_value_size": 18352101, "raw_average_value_size": 1280, "num_data_blocks": 1809, "num_entries": 14332, "num_filter_entries": 14332, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764150724, "oldest_key_time": 0, "file_creation_time": 1764151957, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79fcc812-ff89-4330-9c0d-148e92884770", "db_session_id": "NHY1MHQN9GMTALZ562AS", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}} Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.582030) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 18683527 bytes Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.584245) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.8 rd, 198.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 17.2 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(18.0) write-amplify(8.6) OK, records in: 14878, records dropped: 546 output_compression: NoCompression Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.584277) EVENT_LOG_v1 {"time_micros": 1764151957584262, "job": 36, "event": "compaction_finished", "compaction_time_micros": 94206, "compaction_time_cpu_micros": 53185, "output_level": 6, "num_output_files": 1, "total_output_size": 18683527, "num_input_records": 14878, "num_output_records": 14332, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005536118/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151957584844, "job": 36, "event": "table_file_deletion", "file_number": 62} Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005536118/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764151957587782, "job": 36, "event": "table_file_deletion", "file_number": 60} Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.487308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.587961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.587971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.587974) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.587977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:12:37 localhost ceph-mon[297296]: rocksdb: (Original Log Time 2025/11/26-10:12:37.587980) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 26 05:12:39 localhost nova_compute[281415]: 2025-11-26 10:12:39.364 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:40 localhost nova_compute[281415]: 2025-11-26 10:12:40.677 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:41.993 159486 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9a:5e:b4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '86:cf:7c:68:02:df'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 26 05:12:41 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:41.994 159486 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 26 05:12:42 localhost nova_compute[281415]: 2025-11-26 10:12:42.016 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:44 localhost nova_compute[281415]: 2025-11-26 10:12:44.409 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:45 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 
348127232 kv_alloc: 318767104 Nov 26 05:12:45 localhost nova_compute[281415]: 2025-11-26 10:12:45.680 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:45 localhost openstack_network_exporter[242153]: ERROR 10:12:45 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:12:45 localhost openstack_network_exporter[242153]: ERROR 10:12:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:12:45 localhost openstack_network_exporter[242153]: ERROR 10:12:45 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:12:45 localhost openstack_network_exporter[242153]: Nov 26 05:12:45 localhost openstack_network_exporter[242153]: ERROR 10:12:45 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:12:45 localhost openstack_network_exporter[242153]: ERROR 10:12:45 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:12:45 localhost openstack_network_exporter[242153]: Nov 26 05:12:47 localhost ovn_metadata_agent[159481]: 2025-11-26 10:12:47.996 159486 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=8fad182b-d1fd-4eb1-a4d3-436a76a6f49e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 26 05:12:49 localhost nova_compute[281415]: 2025-11-26 10:12:49.412 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:50 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 
343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:50 localhost nova_compute[281415]: 2025-11-26 10:12:50.684 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:12:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:12:50 localhost systemd[1]: tmp-crun.YPnbmg.mount: Deactivated successfully. Nov 26 05:12:50 localhost podman[331261]: 2025-11-26 10:12:50.833188047 +0000 UTC m=+0.090953297 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2) Nov 26 05:12:50 localhost podman[331260]: 2025-11-26 10:12:50.871233229 +0000 UTC m=+0.133729215 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:12:50 localhost podman[331260]: 2025-11-26 10:12:50.878855315 +0000 UTC m=+0.141351351 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 26 05:12:50 localhost podman[331261]: 2025-11-26 10:12:50.892863511 +0000 UTC m=+0.150628771 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Nov 26 05:12:50 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:12:50 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:12:54 localhost nova_compute[281415]: 2025-11-26 10:12:54.449 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. Nov 26 05:12:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:12:54 localhost systemd[1]: tmp-crun.ZaVOIV.mount: Deactivated successfully. 
Nov 26 05:12:54 localhost podman[331305]: 2025-11-26 10:12:54.834557892 +0000 UTC m=+0.092211016 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 26 05:12:54 localhost podman[331306]: 2025-11-26 10:12:54.883828823 +0000 UTC m=+0.137153202 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 26 05:12:54 localhost podman[331306]: 2025-11-26 10:12:54.897008313 +0000 UTC m=+0.150332732 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 26 05:12:54 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:12:54 localhost podman[331305]: 2025-11-26 10:12:54.947363207 +0000 UTC m=+0.205016361 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 26 05:12:54 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. 
Nov 26 05:12:55 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:12:55 localhost nova_compute[281415]: 2025-11-26 10:12:55.687 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:12:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:12:56 localhost ovn_controller[153664]: 2025-11-26T10:12:56Z|00531|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory Nov 26 05:12:56 localhost podman[331349]: 2025-11-26 10:12:56.835215434 +0000 UTC m=+0.084458765 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:12:56 localhost podman[331349]: 2025-11-26 10:12:56.847418403 +0000 UTC m=+0.096661754 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 26 05:12:56 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:12:57 localhost podman[240049]: time="2025-11-26T10:12:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:12:57 localhost podman[240049]: @ - - [26/Nov/2025:10:12:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:12:57 localhost podman[240049]: @ - - [26/Nov/2025:10:12:57 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1" Nov 26 05:12:59 localhost nova_compute[281415]: 2025-11-26 10:12:59.451 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:00 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:00 localhost nova_compute[281415]: 2025-11-26 10:13:00.699 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:13:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. 
Nov 26 05:13:02 localhost podman[331373]: 2025-11-26 10:13:02.837749377 +0000 UTC m=+0.090898975 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 26 05:13:02 localhost podman[331373]: 2025-11-26 10:13:02.87130648 +0000 UTC 
m=+0.124456068 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 26 05:13:02 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:13:02 localhost podman[331374]: 2025-11-26 10:13:02.886861363 +0000 UTC m=+0.138646999 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:13:02 localhost podman[331374]: 2025-11-26 10:13:02.923999107 +0000 UTC m=+0.175784713 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 26 05:13:02 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:13:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:13:03.681 159486 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:13:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:13:03.681 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:13:03 localhost ovn_metadata_agent[159481]: 2025-11-26 10:13:03.682 159486 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:13:04 localhost nova_compute[281415]: 2025-11-26 10:13:04.455 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:05 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:05 localhost nova_compute[281415]: 2025-11-26 10:13:05.702 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:09 localhost nova_compute[281415]: 2025-11-26 10:13:09.459 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:09 localhost nova_compute[281415]: 2025-11-26 10:13:09.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running 
periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:09 localhost nova_compute[281415]: 2025-11-26 10:13:09.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:09 localhost nova_compute[281415]: 2025-11-26 10:13:09.879 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:13:09 localhost nova_compute[281415]: 2025-11-26 10:13:09.879 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:13:09 localhost nova_compute[281415]: 2025-11-26 10:13:09.880 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:13:09 localhost nova_compute[281415]: 2025-11-26 10:13:09.880 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Auditing locally available compute resources for np0005536118.localdomain (node: np0005536118.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 26 
05:13:09 localhost nova_compute[281415]: 2025-11-26 10:13:09.880 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:13:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 26 05:13:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3855688258' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.373 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.433 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.433 281419 DEBUG nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 26 05:13:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command 
mon_command({"prefix":"df", "format":"json"} v 0) Nov 26 05:13:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/387263629' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 26 05:13:10 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 26 05:13:10 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/387263629' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.677 281419 WARNING nova.virt.libvirt.driver [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.678 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Hypervisor/Node resource view: name=np0005536118.localdomain free_ram=11109MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", 
"product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.679 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.679 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.705 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.778 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Instance 9d78bef9-6977-4fb5-b50b-ae75124e73af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.778 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.778 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Final resource view: name=np0005536118.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 26 05:13:10 localhost nova_compute[281415]: 2025-11-26 10:13:10.856 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 26 05:13:11 localhost ceph-mon[297296]: mon.np0005536118@1(peon) e15 handle_command mon_command({"prefix": "df", "format": 
"json"} v 0) Nov 26 05:13:11 localhost ceph-mon[297296]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2330153409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 26 05:13:11 localhost nova_compute[281415]: 2025-11-26 10:13:11.328 281419 DEBUG oslo_concurrency.processutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 26 05:13:11 localhost nova_compute[281415]: 2025-11-26 10:13:11.336 281419 DEBUG nova.compute.provider_tree [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed in ProviderTree for provider: 05276789-7461-410b-9529-16f5185a8bff update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 26 05:13:11 localhost nova_compute[281415]: 2025-11-26 10:13:11.451 281419 DEBUG nova.scheduler.client.report [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Inventory has not changed for provider 05276789-7461-410b-9529-16f5185a8bff based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 26 05:13:11 localhost nova_compute[281415]: 2025-11-26 10:13:11.454 281419 DEBUG nova.compute.resource_tracker [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Compute_service record updated for np0005536118.localdomain:np0005536118.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 26 05:13:11 localhost nova_compute[281415]: 2025-11-26 10:13:11.454 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 26 05:13:13 localhost nova_compute[281415]: 2025-11-26 10:13:13.454 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:13 localhost nova_compute[281415]: 2025-11-26 10:13:13.455 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:13 localhost nova_compute[281415]: 2025-11-26 10:13:13.455 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:13 localhost nova_compute[281415]: 2025-11-26 10:13:13.455 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 26 05:13:13 localhost nova_compute[281415]: 2025-11-26 10:13:13.849 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:14 localhost nova_compute[281415]: 2025-11-26 10:13:14.461 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:15 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:15 localhost nova_compute[281415]: 2025-11-26 10:13:15.710 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:15 localhost openstack_network_exporter[242153]: ERROR 10:13:15 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 26 05:13:15 localhost openstack_network_exporter[242153]: ERROR 10:13:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:13:15 localhost openstack_network_exporter[242153]: ERROR 10:13:15 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 26 05:13:15 localhost openstack_network_exporter[242153]: ERROR 10:13:15 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 26 05:13:15 localhost openstack_network_exporter[242153]: Nov 26 05:13:15 localhost openstack_network_exporter[242153]: ERROR 10:13:15 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 26 05:13:15 localhost 
openstack_network_exporter[242153]: Nov 26 05:13:15 localhost nova_compute[281415]: 2025-11-26 10:13:15.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:16 localhost ceph-mon[297296]: from='mgr.34351 172.18.0.108:0/3056736363' entity='mgr.np0005536119.eupicg' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 26 05:13:16 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:13:16 localhost nova_compute[281415]: 2025-11-26 10:13:16.847 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:18 localhost nova_compute[281415]: 2025-11-26 10:13:18.848 281419 DEBUG oslo_service.periodic_task [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 26 05:13:18 localhost nova_compute[281415]: 2025-11-26 10:13:18.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 26 05:13:18 localhost nova_compute[281415]: 2025-11-26 10:13:18.849 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 26 05:13:19 localhost nova_compute[281415]: 2025-11-26 10:13:19.043 281419 DEBUG 
oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquiring lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 26 05:13:19 localhost nova_compute[281415]: 2025-11-26 10:13:19.044 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Acquired lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 26 05:13:19 localhost nova_compute[281415]: 2025-11-26 10:13:19.045 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 26 05:13:19 localhost nova_compute[281415]: 2025-11-26 10:13:19.045 281419 DEBUG nova.objects.instance [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d78bef9-6977-4fb5-b50b-ae75124e73af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 26 05:13:19 localhost nova_compute[281415]: 2025-11-26 10:13:19.464 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:19 localhost ceph-mon[297296]: from='mgr.34351 ' entity='mgr.np0005536119.eupicg' Nov 26 05:13:19 localhost nova_compute[281415]: 2025-11-26 10:13:19.510 281419 DEBUG nova.network.neutron [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updating instance_info_cache with network_info: [{"id": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "address": "fa:16:3e:8c:0f:d8", "network": {"id": "3633976c-3aa0-4c4a-aa49-e8224cd25e39", "bridge": "br-int", "label": "private", 
"subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "b2fe3cd6f6ea49b8a2de01b236dd92e3", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afdc9d0-95", "ovs_interfaceid": "5afdc9d0-9595-4904-b83b-3d24f739ffec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 26 05:13:19 localhost nova_compute[281415]: 2025-11-26 10:13:19.527 281419 DEBUG oslo_concurrency.lockutils [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] Releasing lock "refresh_cache-9d78bef9-6977-4fb5-b50b-ae75124e73af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 26 05:13:19 localhost nova_compute[281415]: 2025-11-26 10:13:19.527 281419 DEBUG nova.compute.manager [None req-275393b0-1fc6-40c2-b5c0-d4a4ed133bca - - - - - -] [instance: 9d78bef9-6977-4fb5-b50b-ae75124e73af] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 26 05:13:20 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:20 localhost nova_compute[281415]: 2025-11-26 10:13:20.711 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef. Nov 26 05:13:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e. Nov 26 05:13:21 localhost podman[331540]: 2025-11-26 10:13:21.833089678 +0000 UTC m=+0.090341667 container health_status f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 26 05:13:21 localhost podman[331540]: 2025-11-26 10:13:21.847381492 +0000 UTC m=+0.104633461 container exec_died f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 26 05:13:21 localhost systemd[1]: f2d03685027d13c8ce8b7dc0f685a54de8ba8d0d42b6b2842a008c92cca5667e.service: Deactivated successfully. Nov 26 05:13:21 localhost podman[331539]: 2025-11-26 10:13:21.92941879 +0000 UTC m=+0.185938887 container health_status b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 26 05:13:21 localhost podman[331539]: 2025-11-26 10:13:21.94391493 +0000 UTC m=+0.200435037 container exec_died b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 
'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 26 05:13:21 localhost systemd[1]: b3fc0399ee214241f67bfda9bafbd90cc5e693adf32f07ae15fb423c8840cdef.service: Deactivated successfully. Nov 26 05:13:23 localhost sshd[331582]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:13:23 localhost systemd-logind[761]: New session 75 of user zuul. Nov 26 05:13:23 localhost systemd[1]: Started Session 75 of User zuul. Nov 26 05:13:24 localhost python3[331604]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-dd88-1a57-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 26 05:13:24 localhost nova_compute[281415]: 2025-11-26 10:13:24.490 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:25 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:25 localhost nova_compute[281415]: 2025-11-26 10:13:25.714 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140. 
Nov 26 05:13:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba. Nov 26 05:13:25 localhost systemd[1]: tmp-crun.hfG5nU.mount: Deactivated successfully. Nov 26 05:13:25 localhost podman[331607]: 2025-11-26 10:13:25.830999285 +0000 UTC m=+0.085544529 container health_status 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 26 05:13:25 localhost podman[331608]: 2025-11-26 10:13:25.893822916 +0000 UTC m=+0.143305452 container health_status a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal) Nov 26 05:13:25 localhost podman[331607]: 2025-11-26 10:13:25.904388615 +0000 UTC m=+0.158933859 container exec_died 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 26 05:13:25 localhost systemd[1]: 123da3951598e61d8ee6bcc859d6a4953bec17b0b4ed31f06b23eaa2becc2140.service: Deactivated successfully. Nov 26 05:13:25 localhost podman[331608]: 2025-11-26 10:13:25.933561322 +0000 UTC m=+0.183043858 container exec_died a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350) Nov 26 05:13:25 localhost systemd[1]: a42fe823583c1f9da52635d0f604f05bed50b884bc8315b6213ba7b8338fceba.service: Deactivated successfully. 
Nov 26 05:13:27 localhost podman[240049]: time="2025-11-26T10:13:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 26 05:13:27 localhost podman[240049]: @ - - [26/Nov/2025:10:13:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153864 "" "Go-http-client/1.1" Nov 26 05:13:27 localhost podman[240049]: @ - - [26/Nov/2025:10:13:27 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18798 "" "Go-http-client/1.1" Nov 26 05:13:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab. Nov 26 05:13:27 localhost podman[331653]: 2025-11-26 10:13:27.823723059 +0000 UTC m=+0.077895711 container health_status 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 26 05:13:27 localhost podman[331653]: 2025-11-26 10:13:27.836346702 +0000 UTC m=+0.090519414 container exec_died 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 26 05:13:27 localhost systemd[1]: 4a15d3a3800ed977b3c5fbe215a58844d7ac021ba4de06e6817e73b1a79621ab.service: Deactivated successfully. 
Nov 26 05:13:29 localhost nova_compute[281415]: 2025-11-26 10:13:29.531 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:30 localhost systemd[1]: session-75.scope: Deactivated successfully. Nov 26 05:13:30 localhost systemd-logind[761]: Session 75 logged out. Waiting for processes to exit. Nov 26 05:13:30 localhost systemd-logind[761]: Removed session 75. Nov 26 05:13:30 localhost sshd[331676]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:13:30 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:30 localhost nova_compute[281415]: 2025-11-26 10:13:30.717 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c. Nov 26 05:13:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc. Nov 26 05:13:33 localhost systemd[1]: tmp-crun.tP4sNW.mount: Deactivated successfully. 
Nov 26 05:13:33 localhost podman[331678]: 2025-11-26 10:13:33.838399041 +0000 UTC m=+0.095893120 container health_status 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 26 05:13:33 localhost podman[331678]: 2025-11-26 10:13:33.868728564 +0000 UTC 
m=+0.126222653 container exec_died 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 26 05:13:33 localhost systemd[1]: 659e0e588db40389ba26717cf76ec825fcb7f689f230e7ede029498081ff590c.service: Deactivated successfully. 
Nov 26 05:13:33 localhost podman[331679]: 2025-11-26 10:13:33.887372033 +0000 UTC m=+0.142808387 container health_status 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 26 05:13:33 localhost podman[331679]: 2025-11-26 10:13:33.902297767 +0000 UTC m=+0.157734111 container exec_died 
8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 26 05:13:33 localhost systemd[1]: 8d363a4f2327c9311cb4efd64504c244c9411f80cc065f622555a300878325bc.service: Deactivated successfully. 
Nov 26 05:13:34 localhost nova_compute[281415]: 2025-11-26 10:13:34.569 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:35 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:35 localhost nova_compute[281415]: 2025-11-26 10:13:35.763 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:39 localhost nova_compute[281415]: 2025-11-26 10:13:39.616 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:40 localhost ceph-mon[297296]: mon.np0005536118@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 26 05:13:40 localhost nova_compute[281415]: 2025-11-26 10:13:40.789 281419 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 26 05:13:42 localhost sshd[331715]: main: sshd: ssh-rsa algorithm is disabled Nov 26 05:13:42 localhost systemd-logind[761]: New session 76 of user zuul. Nov 26 05:13:42 localhost systemd[1]: Started Session 76 of User zuul.